Buckle up: we're about to embark on a journey through human-computer interaction. Remember the good old days of graphical interfaces and mice? Hold on to your keyboards, because we're living through a second revolution: natural language interaction.
Join me in this presentation where we'll unleash the power of natural language interaction by turning a boring application into a fantastical LLM-powered app. Wave goodbye to those buttons, filters, and navigation panels, because we're about to rewrite the user interface rulebook! We'll demystify the process by starting from the basics: a quick recap of LLM principles, followed by a first solution using the OpenAI API. We'll then build a second solution using agents with the remarkable LangChain framework.
By the end of this talk, you'll be armed with all the knowledge you need to build your own LLM-powered applications. You'll also leave with a collection of tips and tricks, and a good idea of the pitfalls of working with LLMs.
Disclaimer: Side effects may include an irrepressible desire to chat with inanimate objects. Attend at your own risk!
Marie-Alice Blete
After a decade as a full-stack Java developer, Marie-Alice broadened her skills to Big Data topics and cloud-related architectures. She now works at Worldline as a Software Architect and Data Engineer in the Labs entity. She preaches engineering best practices to her fellow data scientists and is particularly interested in the performance and latency issues associated with deploying AI solutions. She leads a program on AI-augmented developers.
She is also a Developer Advocate who enjoys sharing her knowledge and engaging with the community as a tech speaker, and more recently as an author of the book "Developing Apps with GPT-4 and ChatGPT," published by O'Reilly.