Between Two Joels: Giving AI Longterm Memory

Watch the webinar on-demand

We’re back with our AI experts for another practical learning session

AI's limitations can be a roadblock: you can't just throw information at an AI engine and expect it to perform magic. Knowing the right way to structure and deliver data can be the difference between a somewhat useful insight and a game-changing revelation. Understanding and leveraging AI's context windows (aka its "working memory") is very much doable, and an essential part of upping your insights game. That's why we're back with another session of Between Two Joels to help you unlock AI's full potential.

In this session, we tackle the nuances of AI's context windows and long-term memory.

Four key insights from the session:

The AI Long-Term Memory Concept:

Giving AI long-term memory means giving AI the ability to retain and recall information over extended periods, similar to how humans remember past experiences or learned knowledge. This idea challenges the traditional view of AI as merely reactive and short-sighted, instead suggesting that AI is evolving to handle tasks that require historical context and sustained understanding.

Human vs. AI Memory:

The two Joels compared human memory with what AI memory could look like. Human memory is divided into working memory and long-term memory. Working memory is the information we're actively using, such as remembering a phone number just long enough to dial it. Long-term memory stores information we're not currently thinking about but can retrieve when needed, like recalling childhood memories. The idea is to replicate this structure in AI, where it can hold onto critical pieces of information for future tasks, thus becoming more effective over time.
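To make the analogy concrete, here's a toy sketch of that working/long-term split. All class and method names are illustrative, not any real AI framework's API: working memory is a small, bounded buffer (like a context window that drops the oldest items), while long-term memory archives everything and is searched on demand.

```python
from collections import deque

class AgentMemory:
    """Toy sketch of the working vs. long-term memory split.

    Working memory: a small bounded buffer, like a context window.
    Long-term memory: an unbounded archive, searched when needed.
    """

    def __init__(self, working_capacity=3):
        self.working = deque(maxlen=working_capacity)  # oldest items fall out
        self.long_term = []                            # everything is archived

    def observe(self, fact):
        self.working.append(fact)    # actively "in mind" right now
        self.long_term.append(fact)  # retained for later recall

    def recall(self, keyword):
        """Retrieve archived facts mentioning a keyword, like recalling
        something we weren't actively thinking about."""
        return [f for f in self.long_term if keyword.lower() in f.lower()]

memory = AgentMemory(working_capacity=2)
memory.observe("User's phone number is 555-0100")
memory.observe("User prefers weekly summaries")
memory.observe("User's name is Dana")

# The phone number has already fallen out of the bounded working memory...
print(list(memory.working))
# ...but it can still be recalled from long-term memory on demand.
print(memory.recall("phone"))
```

The point of the sketch: once the buffer is full, old context disappears unless something deliberately archives it, which is exactly the gap long-term AI memory is meant to fill.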

The Challenges and Opportunities:

Giving AI long-term memory also comes with technical and philosophical challenges. On the technical side, there’s the challenge of how to efficiently store and retrieve vast amounts of data without overwhelming the system. On the philosophical side, there’s the question of what it means for AI to remember and how this could affect human-AI interactions. However, the opportunities are vast, including more personalized and context-aware AI applications, better decision-making, and improved human-AI collaboration in complex environments.
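One common answer to the storage-and-retrieval challenge is to score stored items against the current query and surface only the best few, rather than feeding everything back into the model. The sketch below uses simple word overlap as a stand-in for the embedding-based similarity a production system would use; the function name and data are hypothetical.

```python
def retrieve_top_k(query, documents, k=2):
    """Rank stored documents by word overlap with the query and return
    only the k best, so the model's context is never flooded.
    Word overlap is a stand-in for real semantic similarity."""
    query_words = set(query.lower().split())

    def score(doc):
        return len(query_words & set(doc.lower().split()))

    ranked = sorted(documents, key=score, reverse=True)
    return ranked[:k]

notes = [
    "Q3 revenue grew 12 percent year over year",
    "The customer asked about pricing tiers last week",
    "Office plants need watering on Fridays",
]
print(retrieve_top_k("what did the customer ask about pricing", notes, k=1))
```

However crude the scoring, the design choice is the important part: retrieval acts as a filter between a vast archive and a limited context window, which is how systems avoid "overwhelming the system" while still remembering everything.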

Future of AI Memory:

Looking ahead, the discussion speculated on the future implications of AI with long-term memory. Such advancements could lead to AI systems that not only learn from past interactions but also anticipate future needs, making them more proactive and less reliant on continuous human input. This could revolutionize industries like healthcare, finance, and customer service, where long-term contextual understanding could greatly enhance the quality and efficiency of services.

Joel Anderson

Joel Anderson, EVP, Advanced Analytics, Dig Insights

Joel brings decades of experience in the market research and analytics industry to Dig Insights. In 2023, Joel completed his Master’s in Applied Artificial Intelligence from the University of San Diego. Joel’s team leads advanced analytics, innovation R&D, and data science for our client services work.

Ian Ash

Ian Ash, Co-Founder, President of Upsiide, Dig Insights

Ian has over 20 years of experience leading large scale and international market research projects across several different industries. He’s currently President of Upsiide, Dig’s SaaS platform purpose-built for innovation insights.

Joel Armstrong

Joel Armstrong, Director, AI, Dig Insights

Joel is our in-house AI expert, having completed both a Master’s in Applied Artificial Intelligence from the University of San Diego and a PhD in Psychology at Western University. Joel helps lead our R&D as it relates to AI and AI best practices.
