Read time: 3 minutes (Click underlined topics or headings for links.)

Hey AI Family, happy Thursday! OpenAI files for the GPT-5 trademark!

Here’s The Breakdown:

  • 💎 𝟛 Tools to Give You The Edge

  • 🚨 𝟚 AI Updates: GPT-5 & AI investments to hit $200B

  • 💻 𝟙 Practical Use of AI: Design a website

Let’s jump in!

💎 𝟛 Tools to Give You The Edge

Course Factory: Streamline online course creation

Opus Clips: Transform long form videos into short form videos with AI

Hoops: AI powered platform for personalized stock market insights

🚨 Breaking AI News

OpenAI has filed a trademark application for "GPT-5," and according to multiple tweets, the company is reportedly working on the next iteration of its leading AI language model. The filing covers the following features:

  • Artificial production of human speech and text

  • Audio to text conversion

  • Voice and speech recognition

  • Machine-learning based language and speech processing

  • Translation of text or speech from one language to another

  • Creating and generating text

  • Developing and implementing artificial neural networks

Most notably, the filing highlights audio-to-text conversion and the development and implementation of artificial neural networks! These capabilities go well beyond GPT-4's primary focus on text generation.

There is no confirmed release date for GPT-5, but hopefully it will arrive soon!

Global investment in AI could reach $200 billion by 2025, according to a new Goldman Sachs report. They estimate generative AI could increase global productivity by over 1% per year once widely adopted.

This huge growth is driven by the massive economic potential of generative AI models like GPT-4, PaLM-2, and Stable Diffusion. Goldman Sachs has reported a surge in corporate interest in AI, with 16% of US firms now mentioning it on earnings calls.

Nvidia in particular is up over 200% YTD amid this AI hype. But concerns remain around AI's impact on jobs, privacy, and bias. If those challenges are addressed responsibly, AI investment could transform the economy on the scale of past tech booms.

Poll Time! Is the AI bubble bigger than the .com bubble?


🔎Extra Insights🔍

📈 AI-driven ETFs Outperforming Benchmarks

ETFs using AI for stock selection are growing, but questions remain about whether they offer a real advantage. While computers can remove human emotion from decisions, it's uncertain whether they'll outperform benchmarks. (Full article here)

🗣 Chatbots Can't Stop Lying

AI chatbots like ChatGPT frequently make up facts, a core issue as companies explore new uses. Some experts say this flaw may be unfixable, though tech leaders remain optimistic that fixes are one to two years away. Despite heavy investment, making chatbots reliably tell the truth faces big obstacles and may prove impossible. (Full article here)

🏰 White Castle to add AI Drive-Thrus

White Castle will add AI drive-thru voices to 100+ locations by 2024, hoping for faster service and fewer arguments over botched orders. Using SoundHound speech recognition, orders should take just over a minute. Reactions are mixed. Wendy's and Google are also testing AI drive-thrus, with human monitoring. (Full article here)

💻 Real Life Use Cases

Midjourney Prompt: Design a website landing page

Want to use Midjourney to design a beautiful website? Try this prompt!

🔏 Copy and paste the prompt below (Credit)
Landing Page for [subject website], header, hero, cta, UX, UI, full page, beautiful website, usability, high resolution, CSS design award
Here are the results ⬇️

Real estate

Daily Definition

Parameters

'Parameters' in the context of AI and machine learning refer to the internal variables of a model that the algorithm learns from the training data. These are the 'knobs' or 'dials' that the model adjusts during learning to make better predictions or classifications. In a simple linear regression model, for example, the parameters are the slope and the intercept, whose best values the model tries to find based on the data it's given.

Parameters are a core part of what makes a model predictive. After training, a model uses these learned parameters to apply what it has learned to new, unseen data. A model with many parameters has a higher capacity for learning complex patterns, but it also runs a greater risk of overfitting, where it performs well on training data but poorly on unseen data. For a sense of scale, GPT-3 has 175 billion parameters; OpenAI has not disclosed GPT-4's count.
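To make the 'knobs and dials' idea concrete, here's a minimal sketch of a linear regression model learning its two parameters (slope and intercept) by gradient descent. The toy data, learning rate, and iteration count are illustrative assumptions, not from any particular library:

```python
# Toy training data generated by the true line y = 2x + 1
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]

# The model's two parameters: slope (w) and intercept (b),
# initialized arbitrarily and adjusted during training
w, b = 0.0, 0.0
lr = 0.02  # learning rate: how far each "knob" turns per step

for _ in range(5000):
    # Gradients of mean squared error with respect to w and b
    grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)
    grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad_w  # nudge each parameter downhill on the error surface
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # converges toward 2.0 and 1.0
```

After training, w and b are the learned parameters: feed the model a new x it has never seen and it predicts w * x + b. An LLM works the same way in principle, just with billions of such knobs instead of two.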

🔮 AI Inspiration

Poll time! Real or AI-generated picture?


That’s all for today!

If you have something interesting or just want to reach out, don’t hesitate to DM me on Twitter @Brandoncarterss

Thanks for your time today. If it is your first time here, welcome. I hope you found value in today’s Edge edition. If you are returning, thank you. It means the world to me that you spend a few minutes of your day with me. If you have any ideas you’d like me to cover in the future, reply to this email.

Thanks for reading, and see you tomorrow!

Best,

Sign up or share The Edge with a friend here.
