import { getCustomStaticPath } from "@/utils/getCustomStaticPath";

export const meta = {
  title: "Context",
  description:
    "How to pass client-side context to the LLM to help it respond.",
  platforms: [
    "javascript",
    "react-native",
    "angular",
    "nextjs",
    "react",
    "vue",
  ],
};

export const getStaticPaths = async () => {
  return getCustomStaticPath(meta.platforms);
};

export function getStaticProps(context) {
  return {
    props: {
      platform: context.params.platform,
      meta,
      showBreadcrumbs: false,
    },
  };
}
For LLMs to provide high-quality answers to users' questions, they need the right information. Sometimes this information is contextual, based on the user or the state of the application. To support this, you can send `aiContext` along with any user message to the LLM; it can be any structured or unstructured data that might help it respond.
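Because `aiContext` is sent alongside the message, anything JSON-serializable works. As a rough, platform-agnostic sketch (the `buildAiContext` helper is hypothetical, not part of the Amplify API):

```typescript
// Hypothetical helper that gathers contextual data before a message is sent.
type AiContext = Record<string, unknown>;

function buildAiContext(username: string): AiContext {
  return {
    username,
    currentTime: new Date().toLocaleTimeString(),
  };
}

const ctx = buildAiContext("danny");
// The object round-trips through JSON, so it can travel with the message
const serialized = JSON.stringify(ctx);
```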
<InlineFilter filters={["javascript","vue","angular"]}>

```ts
import { generateClient } from "aws-amplify/data";
import type { Schema } from "../amplify/data/resource";

const client = generateClient<Schema>({ authMode: 'userPool' });

const { data: conversation } = await client.conversations.chat.create();

conversation.sendMessage({
  content: [{ text: "hello" }],
  // aiContext can be any shape
  aiContext: {
    username: "danny"
  }
});
```

</InlineFilter>
<InlineFilter filters={["react-native"]}>

```tsx
export default function Chat() {
  const [
    {
      data: { messages },
      isLoading,
    },
    sendMessage,
  ] = useAIConversation('chat');

  function handleSendMessage(message) {
    sendMessage({
      ...message,
      // this can be any object that can be stringified
      aiContext: {
        currentTime: new Date().toLocaleTimeString()
      }
    });
  }

  return (
    //...
  );
}
```

</InlineFilter>
<InlineFilter filters={["react", "nextjs"]}>

```tsx
function Chat() {
  const [
    {
      data: { messages },
      isLoading,
    },
    sendMessage,
  ] = useAIConversation('chat');

  return (
    <AIConversation
      messages={messages}
      isLoading={isLoading}
      handleSendMessage={sendMessage}
      // This will let the LLM know about the current state of this application
      // so it can better respond to questions
      aiContext={() => {
        return {
          currentTime: new Date().toLocaleTimeString(),
        };
      }}
    />
  );
}
```
The function passed to the `aiContext` prop runs immediately before the request is sent, so the LLM receives the most up-to-date information.
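A small sketch of why this matters: because `aiContext` is a function rather than a plain object, it picks up state changes that happen after render (plain TypeScript, independent of the Amplify API):

```typescript
// State that changes over the lifetime of the app
let unreadCount = 0;

// A function captures the variable, not a snapshot of its value
const aiContext = () => ({ unreadCount });

// ...application state changes after the component rendered...
unreadCount = 5;

// Called immediately before the request is sent; sees the latest value
const payload = aiContext();
```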
You can use React context or another state management system to update the data passed to `aiContext`. Using React context, we can provide more information about the current state of the application:
```tsx
// Create a context to share state across components
const DataContext = React.createContext<{
  data: any;
  setData: (value: React.SetStateAction<any>) => void;
}>({ data: {}, setData: () => {} });

// Create a component that updates the shared state
function Counter() {
  const { data, setData } = React.useContext(DataContext);
  const count = data.count ?? 0;
  return (
    <Button onClick={() => setData({ ...data, count: count + 1 })}>
      {count}
    </Button>
  );
}

// Reference the shared data in aiContext
function Chat() {
  const { data } = React.useContext(DataContext);
  const [
    {
      data: { messages },
      isLoading,
    },
    sendMessage,
  ] = useAIConversation('pirateChat');

  return (
    <AIConversation
      messages={messages}
      isLoading={isLoading}
      handleSendMessage={sendMessage}
      // This will let the LLM know about the current state of this application
      // so it can better respond to questions
      aiContext={() => {
        return {
          ...data,
          currentTime: new Date().toLocaleTimeString(),
        };
      }}
    />
  );
}

export default function Example() {
  const [data, setData] = React.useState({});
  return (
    <Authenticator>
      <DataContext.Provider value={{ data, setData }}>
        <Counter />
        <Chat />
      </DataContext.Provider>
    </Authenticator>
  );
}
```

</InlineFilter>
import { getCustomStaticPath } from "@/utils/getCustomStaticPath";

export const meta = {
  title: "Conversation History",
  description:
    "Learn how Amplify AI kit takes care of conversation history",
  platforms: [
    "javascript",
    "react-native",
    "angular",
    "nextjs",
    "react",
    "vue",
  ],
};

export const getStaticPaths = async () => {
  return getCustomStaticPath(meta.platforms);
};

export function getStaticProps(context) {
  return {
    props: {
      platform: context.params.platform,
      meta,
      showBreadcrumbs: false,
    },
  };
}
The Amplify AI kit automatically and securely stores conversation history per user so you can easily resume past conversations.

<Callout>

If you are looking for a quick way to get started with conversation history, [this example project](https://github.com/aws-samples/amplify-ai-examples/tree/main/claude-ai) has an interface similar to ChatGPT or Claude, where users see and manage past conversations in a sidebar.

</Callout>
When you define a conversation route in your Amplify data schema, the Amplify AI kit turns it into two data models: `Conversation` and `Message`. The `Conversation` model works mostly the same way as the other data models defined in your schema. You can list and filter conversations (because they use owner-based authorization, users will only see their own), and you can get a specific conversation by ID. Once you have a conversation instance, you can load any existing messages, send new messages, and subscribe to the stream events being sent back.
## Listing conversations

To list all the conversations a user has, use the `.list()` method. It works the same way as for any other Amplify data model, and you can optionally pass a `limit` or `nextToken`.

```ts
const { data: conversations } = await client.conversations.chat.list();
```
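To give a feel for `limit`/`nextToken` paging, here is a self-contained sketch with a stand-in `list()` function (hypothetical, not the Amplify client) that walks every page:

```typescript
// A page of results, shaped like a { data, nextToken } response
type Page<T> = { data: T[]; nextToken?: string };

const all = ["c1", "c2", "c3"];

// Stand-in list() that pages through `all` using an opaque token
function list(opts: { limit: number; nextToken?: string }): Page<string> {
  const start = opts.nextToken ? Number(opts.nextToken) : 0;
  const data = all.slice(start, start + opts.limit);
  const next = start + opts.limit < all.length ? String(start + opts.limit) : undefined;
  return { data, nextToken: next };
}

// Walk all pages until no nextToken is returned
let token: string | undefined = undefined;
const seen: string[] = [];
do {
  const page = list({ limit: 2, nextToken: token });
  seen.push(...page.data);
  token = page.nextToken;
} while (token);
```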
The conversation model has `createdAt` and `updatedAt` fields, so you can sort by when a conversation was created or last updated. The `updatedAt` field is updated when new messages are sent, so you can use it to find the conversation with the most recent message.

```ts
const { data: conversations } = await client.conversations.chat.list();

conversations.sort((a, b) => (a.updatedAt > b.updatedAt ? -1 : 1));
```

Conversations also have `name` and `metadata` fields you can use to more easily find and resume past conversations. `name` is a string and `metadata` is a JSON object, so you can store any extra information you need.
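For example, you might tag conversations with `metadata` and look one up later to resume it. A minimal sketch using plain objects shaped like these fields (not the Amplify client API):

```typescript
// Records shaped like the conversation model's name/metadata fields
type ConversationSummary = {
  id: string;
  name?: string;
  metadata?: Record<string, unknown>;
};

const conversations: ConversationSummary[] = [
  { id: "1", name: "Trip planning", metadata: { topic: "travel" } },
  { id: "2", name: "Recipe ideas", metadata: { topic: "cooking" } },
];

// Find a past conversation to resume by its metadata
const travel = conversations.find((c) => c.metadata?.topic === "travel");
```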
## Resuming conversations

You can resume a conversation by calling the `.get()` method with a conversation ID. Both `.create()` and `.get()` return a conversation instance.
<InlineFilter filters={['javascript','vue','angular']}>

```ts
// list all conversations a user has
// make sure the user has been authenticated with Amplify Auth
const { data: conversationList } = await client.conversations.chat.list();

// retrieve a specific conversation
const { data: conversation } = await client.conversations.chat.get({ id: conversationList[0].id });

// list the existing messages in the conversation
const { data: messages } = await conversation.listMessages();

// you can now send a message to the conversation
conversation.sendMessage({ content: [{ text: "hello again" }] });
```

</InlineFilter>
<InlineFilter filters={['react','nextjs','react-native']}>

```tsx
export function Chat({ id }) {
  const [
    {
      data: { messages },
    },
    handleSendMessage,
  ] = useAIConversation('chat', { id });

  // ...
}
```

</InlineFilter>
import { getChildPageNodes } from '@/utils/getChildPageNodes';
import { getCustomStaticPath } from "@/utils/getCustomStaticPath";

export const meta = {
  title: "Conversation",
  description:
    "Learn about conversational AI patterns and how to implement them in Amplify.",
  route: '/[platform]/ai/conversation',
  platforms: [
    "javascript",
    "react-native",
    "angular",
    "nextjs",
    "react",
    "vue",
  ],
};

export const getStaticPaths = async () => {
  return getCustomStaticPath(meta.platforms);
};

export function getStaticProps(context) {
  const childPageNodes = getChildPageNodes(meta.route);
  return {
    props: {
      meta,
      childPageNodes,
      showBreadcrumbs: false,
    }
  };
}
The conversation route simplifies building AI-powered conversation interfaces in your application. It automatically sets up the necessary AppSync API components and Lambda functions to handle streaming, multi-turn interactions with Amazon Bedrock foundation models.
## Key Components

1. **AppSync API**: Gateway to the conversation route.
   - Creates new conversation route instances.
   - Sends messages to a conversation route instance.
   - Delivers real-time updates for assistant responses to subscribed clients.

2. **Lambda Function**: Bridge between AppSync and Amazon Bedrock.
   - Retrieves the conversation instance history.
   - Invokes Bedrock's `/converse` endpoint.
   - Handles tool use responses by invoking AppSync queries.

3. **DynamoDB**: Stores conversation and message data.
   - Conversations are scoped to a specific application user.
## Authentication Flow

1. The user's OIDC access token is passed from the client to AppSync.
2. AppSync forwards this token to the Lambda function.
3. The Lambda function uses the token to authenticate requests back to AppSync.
## Usage Scenarios

Each of the following scenarios has safeguards in place to mitigate risks associated with invoking tools on behalf of the user, including:

- An Amazon CloudWatch log group that redacts OIDC access tokens from the Lambda function's logs.
- IAM policies that limit the Lambda function's ability to access other resources.
## Data Flow

1. The user sends a message via an AppSync mutation.
2. AppSync triggers the Lambda function (default or custom).
3. The Lambda function processes the message and invokes Bedrock's `/converse` endpoint.
   a. If the response is a tool use, the Lambda function invokes the applicable AppSync query.
4. The Lambda function sends the assistant response back to AppSync.
5. AppSync sends the response to subscribed clients.

This design allows for real-time, scalable conversations while ensuring that the Lambda function's data access matches that of the application user.
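The flow above can be sketched with stubbed services (these stand-ins are illustrative only, not the real Amplify internals):

```typescript
// A minimal message shape for the sketch
type Message = { role: "user" | "assistant"; text: string };

// Stand-in for Bedrock's /converse endpoint: replies to the last message
function converse(history: Message[]): Message {
  const last = history[history.length - 1];
  return { role: "assistant", text: `echo: ${last.text}` };
}

// Stand-in for the Lambda handler: append the user message to the
// conversation history, call the model, and return the assistant reply
function handleMutation(history: Message[], userText: string): Message {
  const next: Message[] = [...history, { role: "user", text: userText }];
  return converse(next);
}

// Simulate one turn of the mutation -> Lambda -> model -> response path
const reply = handleMutation([], "hello");
```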
## Next Steps

<Overview childPageNodes={props.childPageNodes} />