🌿 Fern Regeneration -- November 6, 2024 (#226)
Co-authored-by: fern-api <115122769+fern-api[bot]@users.noreply.github.com>
fern-api[bot] authored Nov 6, 2024
1 parent ac89e41 commit 8c7c6a3
Showing 19 changed files with 816 additions and 762 deletions.
21 changes: 21 additions & 0 deletions .mock/definition/empathic-voice/__package__.yml
@@ -1070,8 +1070,12 @@ types:
enum:
- value: claude-3-5-sonnet-latest
name: Claude35SonnetLatest
- value: claude-3-5-haiku-latest
name: Claude35HaikuLatest
- value: claude-3-5-sonnet-20240620
name: Claude35Sonnet20240620
- value: claude-3-5-haiku-20241022
name: Claude35Haiku20241022
- value: claude-3-opus-20240229
name: Claude3Opus20240229
- value: claude-3-sonnet-20240229
@@ -3086,6 +3090,23 @@ types:
from a [User
Input](/reference/empathic-voice-interface-evi/chat/chat#send.User%20Input.text)
message.
interim:
type: boolean
docs: >-
Indicates if this message contains an immediate and unfinalized
transcript of the user’s audio input. If it does, words may be
repeated across successive `UserMessage` messages as our transcription
model becomes more confident about what was said with additional
context. Even without a finalized transcription, interim
`UserMessages`, along with
[UserInterrupt](/reference/empathic-voice-interface-evi/chat/chat#receive.User%20Interruption.type)
messages, are useful for detecting if the user is interrupting during
audio playback on the client, signaling your application to stop
playback. Interim `UserMessages` will only be
received if the
[verbose_transcription](/reference/empathic-voice-interface-evi/chat/chat#request.query.verbose_transcription)
query parameter is set to `true` in the handshake request.
source:
openapi: assistant-asyncapi.json
JsonMessage:
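The `interim` field added above is intended to drive client-side interruption handling. A minimal TypeScript sketch of that logic, assuming hypothetical message shapes modeled on this schema (the `type` discriminators and field nesting here are illustrative assumptions, not the SDK's actual generated types):

```typescript
// Assumed message shapes modeled on the schema above -- NOT the SDK's
// actual generated types.
type IncomingMessage =
  | { type: "user_message"; interim: boolean }
  | { type: "user_interruption" }
  | { type: "assistant_message" };

// Stop audio playback when the user audibly interrupts: either an explicit
// UserInterrupt arrives, or an interim UserMessage does -- the latter is
// sent before the transcript is finalized, so it is the earliest signal.
function shouldStopPlayback(msg: IncomingMessage): boolean {
  if (msg.type === "user_interruption") return true;
  if (msg.type === "user_message" && msg.interim) return true;
  return false;
}
```

Because words may repeat across successive interim `UserMessage` payloads as the transcription model gains confidence, a client should treat each interim transcript as a replacement for the previous one, not an append.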
10 changes: 10 additions & 0 deletions .mock/definition/empathic-voice/chat.yml
@@ -74,6 +74,16 @@ channel:
Use the GET `/v0/evi/chat_groups` endpoint to obtain the Chat Group IDs
of all Chat Groups associated with an API key. This endpoint returns a
list of all available chat groups.
verbose_transcription:
type: optional<boolean>
docs: >-
A flag to enable verbose transcription. Set this query parameter to
`true` to receive unfinalized user transcripts from the server as
interim `UserMessage` messages. The
[interim](/reference/empathic-voice-interface-evi/chat/chat#receive.User%20Message.interim)
field on a
[UserMessage](/reference/empathic-voice-interface-evi/chat/chat#receive.User%20Message.type)
denotes whether the message is "interim" or "final."
access_token:
type: optional<string>
docs: >-
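Since `verbose_transcription` is a query parameter on the WebSocket handshake, enabling it is a matter of appending it to the chat URL alongside the auth credential. A sketch in TypeScript (the endpoint path is an assumption for illustration; the parameter names mirror the schema above):

```typescript
// Build the EVI chat handshake URL. The base endpoint path is assumed
// for illustration; the query parameter names follow the schema above.
function buildChatUrl(
  baseUrl: string,
  accessToken: string,
  verboseTranscription: boolean,
): string {
  const url = new URL(baseUrl);
  url.searchParams.set("access_token", accessToken);
  if (verboseTranscription) {
    // Opt in to interim UserMessages (unfinalized transcripts).
    url.searchParams.set("verbose_transcription", "true");
  }
  return url.toString();
}
```

Omitting the flag (or passing `false`) preserves the default behavior, in which only finalized transcripts are delivered.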
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "hume",
"version": "0.9.4",
"version": "0.9.5",
"private": false,
"repository": "https://github.com/HumeAI/hume-typescript-sdk",
"main": "./index.js",