How is THIS Coding Assistant FREE?
Published 2023-12-14
Gear Links
* New MacBook Air M1 Deal: amzn.to/3S59ID8
* Refurb MacBook Air M1 Deal: amzn.to/45K1Gmk
* Great 40Gbps T4 enclosure: amzn.to/3JNwBGW
* My NVMe SSD: amzn.to/3YLEySo
* My gear: www.amazon.com/shop/alexziskind
Related Videos
* LLMs: From Zero to Hero with M3 - "Zero to Hero LLMs with M3 Max BEAST"
* MacBook Performance: Developer Shocked! - "Developer Shocked by my MacBook"
* Mind-Blowing Machine Learning on Neural Engine - "INSANE Machine Learning on Neural Eng..."
* Apple Silicon: Who Needs It? - "Apple Silicon is back, but for which ..."
* AI for Coding Playlist - "AI"
#programming #ai #softwaredevelopment
SUBSCRIBE TO MY YOUTUBE CHANNEL
Click here to subscribe: youtube.com/@azisk?sub_confirmation=1
LET'S CONNECT ON SOCIAL MEDIA
ALEX ON TWITTER: twitter.com/digitalix
CODY ON TWITTER: twitter.com/sourcegraphcody
All Comments (21)
-
Hey man! I'm a senior in computer science and I enjoy your content so much. I'm really looking forward to you making videos on Apple's MLX framework. I have a 14" M2 Max with a 30-core GPU, and I'd really like to see how it stacks up against other Macs.
-
It looks very promising. Testing it right now. Thanks for sharing it!
-
My concern is that in order for the suggestions to be presented, your code has to be sent via the plug-in to Cody/Copilot and "fed" to the LLM on their servers. How do we know that these services aren't stockpiling users' code that may one day be divulged to the internet due to a security leak?
-
I have been using a plug-in called Pieces that does snippet storage, but it also has a chat feature where you can use cloud or local LLMs, along with adding portions of your code to the model's view.
-
Your videos look so polished. Great editing, voice quality and content. A pleasure to watch.
-
I am glad that I subscribed to you.
-
At the rate we are going, I think it will only be a few years until local LLMs will be "good enough" coding assistants (hopefully that will end subscription costs).
-
Great video as always!
-
I love all of your useful tech videos; geeky as I am, they're amazing brain food. Kudos!
-
Thanks, it works well and is really useful!
-
I wouldn't bet that they will bring local LLMs to the free tier, because it might discourage people from purchasing a premium tier if they can just run it locally with their own model. The free tier gives you a "taste," and the artificial limit is there to encourage you to buy the premium tier. I do like that you can try before you buy.
-
In theory, if your local LLM application presents an OpenAI-compatible API, you could use that in combination with something like Cody. I suppose even if it were locked down, you could edit your hosts file and point OpenAI's API servers to your local LLM.
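To illustrate the idea in that comment: an OpenAI-compatible local server only differs from the hosted API in its base URL, so a client just needs to aim its `/chat/completions` request at localhost. A minimal sketch, assuming a local server such as Ollama on its default port 11434 and a hypothetical model name `codellama` (both are assumptions, not something the video confirms):

```python
import json
import urllib.request

# Assumed local endpoint (Ollama's OpenAI-compatible API lives under /v1);
# swap in whatever your local LLM server actually exposes.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(prompt: str, model: str = "codellama") -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at a local server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Write a Python hello world")
print(req.full_url)
# With the server running, urllib.request.urlopen(req) would send it;
# the request shape is identical to what the hosted API expects.
```

The hosts-file trick the commenter mentions works on the same principle: the client keeps speaking the OpenAI wire format, and only the destination host changes.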
-
You can actually connect to a local LLM already. I connected it to Code Llama and it works well.
-
I'm concerned about privacy when Cody "addresses" my entire codebase.
-
I've used it. It's great.
-
Always love to watch your videos. Love from India!
-
The new 14th Gen Intel processors for laptops just got released. Hope to see you get one!
-
Great video, local llm ftw!
-
You are a man from heaven. That was my concern, bro!
-
Can you compare the MacBook Air M1 (8 GB RAM) vs. the MacBook Pro M3 (18 GB RAM)? I want to see whether the upgrade is really worth it.