
Read time: 2 minutes (Click underlined topics or headings for links.)
Hey Edge Family, happy Friday! You made it to the weekend. Let's wind down the work week with some AI news.
Here’s The Breakdown:
💎 𝟛 Tools to Give You The Edge
🚨 𝟚 AI Updates: Meta’s newest AI & Your voice = 30 languages
💻 𝟙 Practical Use of AI: One click lesson plan
Let’s jump in!

💎 𝟛 Tools to Give You The Edge
Code Llama: Meta’s coding LLM (get access here)
Kaiber: AI video generator using your texts and images
Scribe: Use AI for your boring documentation process

🚨 Breaking AI News 🚨

Image credit: Meta AI
Yesterday, Meta announced an “open-sourced” AI model, Code Llama: a code-specialized version of its text generator Llama 2, trained on billions of tokens of code.
Can generate, complete, and explain code in natural language across programming languages
Comes in 7B, 13B, and 34B parameter models to balance performance and speed
Specialized Python and instruction-tuned versions available
Can take up to 100,000 tokens of code as input for longer programs
Devs? Will you be trying this model or are you sticking to GPT-4?

Image credit: Github
Startup ElevenLabs announced that its AI voice cloning model can now synthesize natural speech in 30 languages, up dramatically from the original 8. The coolest part: it can make your own voice speak all 30 of them!
It provides text-to-speech and custom voice cloning to mimic specific people. Users simply upload samples, which the AI analyzes to build a believable synthetic version.
ElevenLabs itself faced backlash last year when its platform was exploited to harass public figures. While it is implementing more safeguards, risks remain with such powerful generative audio. Visit ElevenLabs here.
What do you think? Is AI voice cloning unethical and why?
🔎 Extra Insights 🔍
🙂 Hugging Face raises $235M from investors
AI startup Hugging Face raised a $235M Series D at a $4.5B valuation, doubling since May 2022. The round included Google, Amazon, Nvidia, and Salesforce, valuing Hugging Face at over 100x revenue amid massive appetite for AI. (Full article here)
🚫 Websites Block OpenAI's GPTBot
Dozens of major companies, including Amazon and The New York Times, are blocking OpenAI's web crawler GPTBot from accessing their sites to feed data to ChatGPT. (Full article here)
☢️ Open Source AI Myth Exposed
Researchers say many "open" models come with catches. Open licensing increases transparency and access, but Llama 2, while free to use, lacks a true open source license and restricts commercial use. (Full article here)

💻 Real Life Use Cases
ChatGPT Prompt: One Click Lesson Plan + Quiz Creator
This prompt is for any teachers or educators looking to create an easy lesson plan with a simple prompt.
🔏 Copy and paste the prompt below ⏬
Role: You are an expert Lesson Planner with 40 years of experience in teaching K-12.
#Subject:
#Grade Level:
#Topic/Theme:
#Duration:
Output 1: Create a detailed Lesson Plan for my students to learn the following.
Output 2: Create a 10 question multiple choice Quiz with a Key to the answers.
Output 3: Create a 3 minute lecture I can use to introduce the subject that entails rigor and relevance with examples of how it applies to real life.
Output 4: Create a list of activities or group activities that can be assigned to students to further their understanding of the lesson.
Output 5: Create a homework assignment.
Output 6: Provide a glossary of terms.
Output 7: Create a 10 minute narration script for a Podcast that teaches this entire lesson in full.
Try it out at the Snack Prompts link below ⬇️
Daily Definition
Parameters
In the context of machine learning and AI, parameters are the internal variables that a model tweaks and optimizes during training to better predict outcomes from input data. Think of them as the model's 'knobs' that get adjusted to improve accuracy. In a neural network, for instance, the weights and biases are the parameters. They are updated iteratively as the model learns from the training data, helping it generalize and make accurate predictions or decisions on new, unseen data. Their optimal values are what distinguish a well-trained model from an underperforming one. Meta's Code Llama, for example, comes in versions with 7 billion to 34 billion parameters.
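To make that concrete, here's a toy sketch in Python (my own illustration, nothing to do with how Meta trains its models): a model with exactly two parameters, a weight and a bias, that get nudged step by step during training until the predictions match the data.

```python
import numpy as np

# Tiny dataset that secretly follows y = 2x + 1.
X = np.array([0.0, 1.0, 2.0, 3.0])
y = 2 * X + 1

# The model's two "knobs": a weight and a bias, starting at zero.
w, b = 0.0, 0.0
lr = 0.05  # learning rate: how big each adjustment is

for _ in range(2000):
    pred = w * X + b        # model's current guesses
    err = pred - y          # how wrong each guess is
    # Gradient descent: nudge each parameter to shrink the error.
    w -= lr * (2 * err * X).mean()
    b -= lr * (2 * err).mean()

print(round(w, 2), round(b, 2))  # close to 2.0 and 1.0
```

A model like Code Llama does exactly this kind of nudging, just with billions of knobs instead of two.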

🔮 AI Inspiration

How was today's newsletter?
That’s all for today!
If you have something interesting or just want to reach out, don’t hesitate to DM me on Twitter @Brandoncarterss
Thanks for your time today. If it is your first time here, welcome. I hope you found value in today’s Edge edition. If you are returning, thank you. It means the world to me that you spend a few minutes of your day with me. If you have any ideas you’d like me to cover in the future, reply to this email.
Thanks for reading, see you tomorrow!
Best,
Sign up or share The Edge with a friend here.
