This seems to be a significant development? It’s affecting stock markets today.
TLDR: OpenAI, Anthropic, etc. spend $100M+ just on compute. DeepSeek just showed up and said "LOL what if we did this for $5M instead?" How? They rethought everything from the ground up. Traditional AI stores every number with 32 bits of precision. DeepSeek was like "what if we just used 8 bits? It's still accurate enough!" Boom - 75% less memory needed.
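Quick back-of-the-envelope sketch of why 32 bits vs 8 bits works out to 75% less memory. The array size is made up, and NumPy has no true 8-bit float, so uint8 stands in for an FP8 format here; this illustrates the storage arithmetic, not DeepSeek's actual training recipe:

```python
# Reduced-precision storage: same number of values, a quarter of the bytes.
import numpy as np

n_params = 1_000_000                                   # pretend model with 1M parameters

full_precision = np.zeros(n_params, dtype=np.float32)  # 4 bytes per value
eight_bit = np.zeros(n_params, dtype=np.uint8)         # 1 byte per value (stand-in for FP8)

print(full_precision.nbytes)                           # 4000000 bytes
print(eight_bit.nbytes)                                # 1000000 bytes
print(1 - eight_bit.nbytes / full_precision.nbytes)    # 0.75 -> 75% less memory
```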
They built a "mixture of experts." Instead of one massive AI trying to know everything (like having one person be a doctor, lawyer, AND engineer), they have specialized experts that only wake up when needed. Traditional models? All 1.8 trillion parameters active ALL THE TIME. DeepSeek? 671B total but only 37B active at once. It's like having a huge team but only calling in the experts you actually need for each task.
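Here's a toy sketch of the mixture-of-experts idea in plain NumPy. Everything in it (sizes, routing rule, random weights) is invented for illustration; it shows the general concept, not DeepSeek's actual design:

```python
# A router scores every expert for each input, but only the top-k experts run.
import numpy as np

rng = np.random.default_rng(0)
dim, n_experts, top_k = 8, 4, 2

experts = [rng.standard_normal((dim, dim)) for _ in range(n_experts)]  # one weight matrix each
router = rng.standard_normal((dim, n_experts))                         # scores experts per input

def moe_layer(x):                                   # x: one input vector of length dim
    scores = x @ router                             # one score per expert
    chosen = np.argsort(scores)[-top_k:]            # indices of the top-k experts
    weights = np.exp(scores[chosen])
    weights /= weights.sum()                        # softmax over the chosen experts only
    # Only the chosen experts do any work; the other matrices are never touched.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

print(moe_layer(rng.standard_normal(dim)).shape)    # (8,) -- same size out as in
```

The point: the router only ever touches 2 of the 4 expert matrices per input, which is why total parameter count and active parameter count can be wildly different.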
The results are mind-blowing:
- Training cost: $100M → $5M
- GPUs needed: 100,000 → 2,000
- API costs: 95% cheaper
- Can run on gaming GPUs instead of data center hardware
Why does this matter? Because it breaks the model of "only huge tech companies can play in AI." You don't need a billion-dollar data center anymore. A few good GPUs might do it.
AI has always been a bubble, just like 'Quantum' is. But at least it isn't the con job that Crypto is.
Gonna buy me some more NVDA at 17% off too!
Think of AI as basically a vector calculation. Take a book, assign each word a real (decimal) number, and put those numbers into an array based on word order and frequency of use. Every new paper, book, etc. that is added becomes a new layer of that array.
AI is the vector through all of those arrays that gives the most likely sequence of words.
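A toy illustration of that "words as numbers" idea. The vectors below are made up; real models learn them from mountains of text, and each word gets hundreds or thousands of numbers rather than two:

```python
# Each word gets a vector; candidate next words are scored by how well their
# vectors line up with the current context, and the best score wins.
import numpy as np

embeddings = {
    "cat": np.array([0.9, 0.1]),
    "dog": np.array([0.8, 0.2]),
    "car": np.array([0.1, 0.9]),
}

context = np.array([0.85, 0.15])   # stand-in for "the state after reading some text"

scores = {word: float(vec @ context) for word, vec in embeddings.items()}
print(max(scores, key=scores.get))  # "cat" -- highest dot product wins
```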
Video games are also vector math. Everything you see is basically a triangle at a given size and angle and colour. NVidia makes chips that are amazing at drawing these vectors. AMD is good, NVidia is awesome.
So NVidia chips are also good at the vector math needed for AI. A very simplified explanation, but hopefully it helps. What the Chinese have done is write the code that does these vector calculations to be super fast, where OpenAI and others relied on the brute force of NVidia silicon to do the same thing.
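To make "it's all the same math" concrete, here's the shared core operation. The numbers are arbitrary; the point is that rotating triangles for graphics and pushing activations through a neural-net layer are both just matrix multiplies, which is exactly what GPUs are built to do in massive parallel batches:

```python
import numpy as np

# Graphics: transform the vertices of a triangle with a rotation matrix.
triangle = np.array([[0, 0], [1, 0], [0, 1]], dtype=float)   # 2D vertices
rotate_90 = np.array([[0, -1], [1, 0]], dtype=float)
print(triangle @ rotate_90.T)

# AI: push an activation vector through one layer of a network.
weights = np.random.rand(4, 2)
activations = np.random.rand(2)
print(weights @ activations)
```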
So it seems that ChatGPT was the first one to lose its job to AI.
And perhaps you also know why large language model AI is inherently stupid.
How one YouTuber is trying to poison the AI bots stealing her content
DeepSeek iOS app sends data unencrypted to ByteDance-controlled servers
Shocking. :\
Bought an Orin Nano. For those that don't know, it uses an NVidia AI processor and is fantastic for setting up your own personal AI. I got the development board, so there are pins I can use to interface it to things like cameras and sensors, like a Raspberry Pi.
Now I just have to figure out how to use it for World Domination.
So, the company I bought the Nano from sent me a merchant message through Newegg, just to assure me that the delivery was proceeding normally.
Embedded in the message was a 1X1 pixel blank image. Just to let them know who I am.
But I load emails as plain text. Checkmate!
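For anyone curious what that pixel actually is: just an image tag whose URL is unique to the recipient, so the sender's server logs who opened the mail and when. The URL below is invented for illustration:

```python
# A "tracking pixel" is nothing more than a 1x1 image with a per-recipient URL.
tracking_pixel = (
    '<img src="https://example-merchant.invalid/open?recipient=12345" '
    'width="1" height="1" alt="">'
)
# An HTML mail client fetches that URL when rendering the message; a plain-text
# client never requests it, so no "open" event is ever logged.
print(tracking_pixel)
```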
AI 'hallucinations' in court papers spell trouble for lawyers
Turns out, for $1000/h, lawyers are also lazy AF.
Judges Are Fed up With Lawyers Using AI That Hallucinate Court Cases
Fake job seekers are flooding U.S. companies that are hiring for remote positions, tech CEOs say
Facebook Pushes Its Llama 4 AI Model to the Right, Wants to Present “Both Sides”
Truth has a liberal bias.