Google Enters the Chat with Gemini, the Curious Case of ChatGPT, and the Dawn of Personalized Local AI
A glimpse into how AI will evolve in 2024
Welcome back humans.
Remember this time last year, when OpenAI had just launched ChatGPT? It’s been a whirlwind with no signs of slowing down, and this week was no different. I’m working on my 2024 predictions and can’t wait to share where I think the world and AI are going next year. But before we get to that….
Here’s what you need to know about AI today:
Google Enters the Chat with Gemini
Is ChatGPT Getting Dumber?
Prediction: Personal AI Gets Local
#1 Google Enters the Chat with Gemini
Google's throwing its hat into the AI ring with something called Gemini.
This new kid on the block could potentially outshine GPT-4; at least, that's the current talk of the town.
What’s making Gemini so appealing is that it's acing 30 out of 32 academic benchmarks that measure how smart these AI models are.
It's also multimodal: it can juggle text, code, images, and even audio.
Google's dream is to bake Gemini into all sorts of products, making our digital lives a breeze.
They're painting a picture of an AI that can devour huge chunks of data and spit out shiny pearls of wisdom.
The plan is to have three flavors of Gemini on the menu.
Gemini Ultra is the five-star chef for complex brain-teasers.
Gemini Pro is your versatile, go-to guy for a heap of different tasks.
And Gemini Nano is like that handy gadget you never knew you needed, ready to run on your phone.
According to Sundar Pichai, this is one of the coolest things they've ever done.
They're not just building a smarter AI, they're kicking off a whole new era.
But as of now, it's only Gemini Pro that's stepping out in public, powering Google Bard.
The tech nerds will love this: Gemini is a whiz on Google's ultra-fast custom chips, the Tensor Processing Units.
It seems to sprint through tasks faster than you can say "artificial intelligence."
One thing's for sure, though, Gemini's also a champ at bolstering Google's own hype machine.
The big question is, will it live up to the fanfare?
💡 My take:
Finally. The saga that is AI has been interesting to watch, especially considering how Google has been the favorite child in the tech world since, well… its inception. I never thought I’d see the day when Google was out-innovated, and then there was 2023. Google has been a leader in investing in AI, with its $400M acquisition of DeepMind in 2014 alone and a $300M investment in Anthropic earlier this year (notably still behind Amazon’s $1.25B investment in the same company).
OpenAI’s ascent shows that larger companies like Google have been slow to catch up regardless of how much capital they’ve invested in the technology. The challenge is steering a large ship like Google or Amazon to produce innovative products at the speed of AI. I personally am excited to see Google enter the chat. I think strategically they spent far too much time trying to compete on the PR front with Bard and not enough time focusing on how LLMs (Bard and others) could integrate into their existing product offering. I understand why: Optics are everything, and Google knows that LLMs will replace search eventually. 2023 has been the year this user behavior changed. But Google does have us in a chokehold with Gmail and YouTube, lol…. such an easy way to integrate AI with a bang. Google’s biggest moat is the vast amount of data that it has on all of us. If you haven’t done a data request to see exactly how much data Google has on you, it’s well worth the time. Any LLM is only as good as the data it has, and OpenAI’s models are trained on publicly available data, not the private data that Google has. This is where the real upside is.
#2 Is ChatGPT Getting Dumber?
I knew I wasn’t alone here! People are noticing that ChatGPT is turning down more prompts or returning lazy, simplified answers.
Rumors are swirling that it might be a cost-saving move—imagine trimming down a hefty $694K daily bill!
If ChatGPT is also giving you the cold shoulder, here's what you can do to keep the conversation flowing:
Switch to ChatGPT Classic, the un-updated, original, and still very capable model.
Treat ChatGPT-4 with a touch of kindness. Politeness and empathy can go a long way.
If you're coding with ChatGPT's API, consider dialing down the 'temperature' to help it stay on track.
Patience is a virtue, especially when dealing with AI growing pains.
And who knows, your AI chat buddy might just need a little extra love today.
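If you're hitting the API directly, the temperature tip above is a one-line tweak. Here's a minimal sketch in Python; `make_request_params` is a hypothetical helper of my own, and the commented-out call assumes the official `openai` client (v1.x) with an `OPENAI_API_KEY` in your environment:

```python
def make_request_params(prompt: str, temperature: float = 0.2) -> dict:
    """Build keyword arguments for a chat completion call.

    A lower temperature (roughly 0.0-0.3) makes sampling more
    deterministic, which tends to keep code-generation answers on track.
    """
    return {
        "model": "gpt-4",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

# With the official `openai` Python client (v1.x), you'd pass these through:
# from openai import OpenAI
# client = OpenAI()  # reads OPENAI_API_KEY from the environment
# response = client.chat.completions.create(**make_request_params("Write a merge sort"))
# print(response.choices[0].message.content)
```

The default temperature is higher (1.0), which is great for brainstorming but invites the rambling, off-track answers people are complaining about.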
💡 My take:
I legitimately thought this was just me until I came across a subreddit on the topic. My take was that:
1.) They are processing a sh*t ton of data and need to cut down on compute costs by throttling output quality.
2.) They are about to launch another pricing tier that allows you to pay for a higher quality output.
Two things can be true 😄. Whatever it is, you’re not alone, and I wanted to include this in my email this week so that you would know IT AIN’T JUST YOU!
#3 Prediction: Personal AI Gets Local
Intel's CEO, Pat Gelsinger, lit up the stage at the company’s 2023 Innovation Summit with his vision of AI PCs that run models right on your laptop, no internet needed.
Big tech names like Apple and Qualcomm are in a sprint, aiming to embed artificial intelligence straight into our devices.
According to Pallavi Mahajan from Intel, AI is already a big deal for edge computing, especially with natural language processing and computer vision.
Microsoft and Google are dishing out billions to amp up AI in the cloud, but there's a glitch: speed.
Professor Oliver Lemon shares that big language models can be sluggish for real-time chat, something he faced with his robotic assistant, Spring.
Spring turned to a smaller, zippier AI model called Vicuna-13B for smooth chit-chat.
With a more modest AI model, you get speed and privacy, since your data doesn't need to travel the internet.
An app called Rewind shows off this local AI power, finding lost files or summarizing emails right on your PC. Expect to see more of these smaller, focused models in 2024.
Privacy is a huge win with local AI; you keep your data close without it blasting across the web.
Phil Solis from IDC says on-device AI is going to explode, especially with privacy in mind.
AI is stepping out of the cloud shadow, edging its way back onto our devices for speed and privacy.
Apps like Rewind and Siri going offline on Apple Watch are just the beginning.
This is all part of the AI coming home to roost, with local processing taking center stage.
💡 My take:
On-device AI is one of the biggest trends to watch in 2024. This is where the likes of Apple, Amazon, and Google will shine, simply by the sheer integration of their hardware (phones, speakers, thermostats, refrigerators, etc.) into our daily lives. It’s taking the Internet of Things (IoT) market and supercharging it. If you missed my op-ed on Sunday, I go into more detail about market shifts between IoT and AI there. Next year you’ll see a rise in on-device AI applications. Simply put, the next great chatbot will run at lightning speed on your laptop, no internet connection required. I love this from a privacy standpoint, but we’ll need to revisit the whole data portability dilemma in order for this to truly shine.
💩Sh*ts & Giggles
I’m working on my 2024 Predictions, so I’ll be taking a break from my Sunday essay this week to focus on giving you the goods on what to look out for next year.
That’s all for today folks!
See you on the interwebs,
AB