We started looking at Mastra and found resources clarifying what a copilot means from a UX lens.
A copilot would live outside the LLM conversation flow, i.e. outside the chats we are currently building.
So, as an example MVP or spike: fork Shapeshift and use the chat to control it. Basically, plug the agent side-project into what is now the web search bar to see if it can control Shapeshift's swappers. Web makes a good playground for trying this out, since we can already feed data to the model. Theoretically it could solve a lot of our issues with this magic front-end RAG. The agent chat acts as a copilot (you can chat with it directly and see the actions it takes, much like Cursor chats or the Newton agent experience), but the magic happens in the app itself, outside the sidebar/chat, controlled by the copilot.
Dev thoughts: “we build the Glyph app as a regular app, have data and flows available the usual way.”
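A minimal sketch of the pattern above: the chat agent emits tool calls, and the app (outside the sidebar) applies them to its own state. All names here (`ToolCall`, `AppState`, `setSwapPair`, etc.) are illustrative assumptions, not the actual Shapeshift or CopilotKit API.

```typescript
// Hypothetical copilot dispatch: the agent chat produces tool calls,
// the regular app executes them against its normal state/flows.

type ToolCall = { name: string; args: Record<string, unknown> };

type AppState = { sellAsset?: string; buyAsset?: string; amount?: number };

// Registry of app-side actions the copilot is allowed to drive.
const actions: Record<string, (state: AppState, args: any) => AppState> = {
  setSwapPair: (state, args) => ({
    ...state,
    sellAsset: args.sellAsset,
    buyAsset: args.buyAsset,
  }),
  setAmount: (state, args) => ({ ...state, amount: args.amount }),
};

// Apply each tool call the model emits to the app state, in order.
function applyToolCalls(state: AppState, calls: ToolCall[]): AppState {
  return calls.reduce((s, call) => {
    const action = actions[call.name];
    return action ? action(s, call.args) : s; // ignore unknown tools
  }, state);
}

// Example: the chat decides to set up an ETH -> BTC swap for 1.5 ETH.
const next = applyToolCalls({}, [
  { name: "setSwapPair", args: { sellAsset: "ETH", buyAsset: "BTC" } },
  { name: "setAmount", args: { amount: 1.5 } },
]);
console.log(next); // logs the updated swap state
```

The key design point is that the chat never touches the DOM directly: it only emits tool calls into a registry the app defines, so the "magic" stays in regular app code.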
https://docs.copilotkit.ai/mastra/human-in-the-loop
https://docs.copilotkit.ai/mastra/agentic-chat-ui
https://docs.copilotkit.ai/mastra/generative-ui/tool-based
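The human-in-the-loop doc linked above matters for the swapper case: a tool call that moves funds should pause until the user approves it in the UI. A rough sketch of that gate, with illustrative names (`PendingCall`, `requestApproval`) rather than CopilotKit's real API:

```typescript
// Hypothetical human-in-the-loop gate: sensitive tool calls start as
// "pending" and only execute after the user approves them in the UI.

type PendingCall = {
  name: string;
  args: Record<string, unknown>;
  status: "pending" | "approved" | "rejected";
};

function requestApproval(name: string, args: Record<string, unknown>): PendingCall {
  return { name, args, status: "pending" };
}

// In the real app the UI would flip status on the user's click;
// here we simulate it directly.
function resolve(call: PendingCall, approved: boolean): PendingCall {
  return { ...call, status: approved ? "approved" : "rejected" };
}

function execute(call: PendingCall): string {
  if (call.status !== "approved") return "blocked: awaiting user approval";
  return `executing ${call.name}`;
}

const swap = requestApproval("executeSwap", { amount: 1.5 });
console.log(execute(swap));                // blocked until approved
console.log(execute(resolve(swap, true))); // runs once approved
```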