llamaindex
Enumerations
- ClipEmbeddingModelType
- DeuceChatStrategy
- IndexStructType
- KeywordTableRetrieverMode
- MetadataMode
- NodeRelationship
- ObjectType
- OpenAIEmbeddingModelType
- SimilarityType
- SummaryRetrieverMode
- Tokenizers
- VectorStoreQueryMode
Classes
- Anthropic
- BaseDocumentStore
- BaseEmbedding
- BaseInMemoryKVStore
- BaseIndex
- BaseIndexStore
- BaseKVStore
- BaseNode
- CallbackManager
- ClipEmbedding
- CompactAndRefine
- CondenseQuestionChatEngine
- ContextChatEngine
- DefaultContextGenerator
- Document
- HTMLReader
- HistoryChatEngine
- ImageDocument
- ImageNode
- InMemoryFileSystem
- IndexDict
- IndexList
- IndexNode
- IndexStruct
- KeywordTable
- KeywordTableIndex
- KeywordTableLLMRetriever
- KeywordTableRAKERetriever
- KeywordTableSimpleRetriever
- LLMQuestionGenerator
- LlamaDeuce
- MarkdownReader
- MongoDBAtlasVectorSearch
- MultiModalEmbedding
- NotionReader
- OpenAI
- OpenAIEmbedding
- PDFReader
- PapaCSVReader
- Portkey
- PromptHelper
- Refine
- Response
- ResponseSynthesizer
- RetrieverQueryEngine
- SentenceSplitter
- SimilarityPostprocessor
- SimpleChatEngine
- SimpleChatHistory
- SimpleDirectoryReader
- SimpleDocumentStore
- SimpleIndexStore
- SimpleKVStore
- SimpleMongoReader
- SimpleNodeParser
- SimpleResponseBuilder
- SimpleVectorStore
- SubQuestionOutputParser
- SubQuestionQueryEngine
- SummaryChatHistory
- SummaryIndex
- SummaryIndexLLMRetriever
- SummaryIndexRetriever
- TextFileReader
- TextNode
- TreeSummarize
- VectorIndexRetriever
- VectorStoreIndex
Interfaces
- BaseIndexInit
- BaseNodePostprocessor
- BaseOutputParser
- BaseQueryEngine
- BaseQuestionGenerator
- BaseReader
- BaseRetriever
- BaseTool
- ChatEngine
- ChatHistory
- ChatMessage
- ChatResponse
- Context
- ContextGenerator
- DefaultStreamToken
- Event
- ExactMatchFilter
- GenericFileSystem
- LLM
- LLMMetadata
- MessageContentDetail
- MetadataFilters
- MetadataInfo
- NodeParser
- NodeWithScore
- QueryEngineTool
- RefDocInfo
- RelatedNodeInfo
- RetrievalCallbackResponse
- ServiceContext
- ServiceContextOptions
- StorageContext
- StreamCallbackResponse
- StructuredOutput
- SubQuestion
- ToolMetadata
- VectorStore
- VectorStoreInfo
- VectorStoreQuery
- VectorStoreQueryResult
- VectorStoreQuerySpec
- WalkableFileSystem
Type Aliases
AnthropicStreamToken
Ƭ AnthropicStreamToken: Object
Type declaration
Name | Type |
---|---|
completion | string |
log_id? | string |
model | string |
stop? | boolean |
stop_reason | string \| undefined |
Defined in
packages/core/src/callbacks/CallbackManager.ts:42
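A minimal sketch of an object matching this shape, with illustrative values only (real tokens are produced by the streaming callbacks, not constructed by hand):

```typescript
import type { AnthropicStreamToken } from "llamaindex";

// Illustrative values; the optional log_id and stop fields are omitted.
const token: AnthropicStreamToken = {
  completion: "Hello",
  model: "claude-2",
  stop_reason: undefined,
};
```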
ChoiceSelectPrompt
Ƭ ChoiceSelectPrompt: typeof defaultChoiceSelectPrompt
Defined in
packages/core/src/Prompt.ts:165
CompleteFileSystem
Ƭ CompleteFileSystem: GenericFileSystem & WalkableFileSystem
Defined in
packages/core/src/storage/FileSystem.ts:49
CompletionResponse
Ƭ CompletionResponse: ChatResponse
Defined in
packages/core/src/llm/LLM.ts:51
CondenseQuestionPrompt
Ƭ CondenseQuestionPrompt: typeof defaultCondenseQuestionPrompt
Defined in
packages/core/src/Prompt.ts:346
ContextSystemPrompt
Ƭ ContextSystemPrompt: typeof defaultContextSystemPrompt
Defined in
packages/core/src/Prompt.ts:367
EventTag
Ƭ EventTag: "intermediate" | "final"
Defined in
packages/core/src/callbacks/CallbackManager.ts:10
EventType
Ƭ EventType: "retrieve" | "llmPredict" | "wrapper"
Defined in
packages/core/src/callbacks/CallbackManager.ts:11
ImageNodeConstructorProps
Ƭ ImageNodeConstructorProps<T>: Pick<ImageNode<T>, "image" | "id_"> & Partial<ImageNode<T>>
Type parameters
Name | Type |
---|---|
T | extends Metadata |
Defined in
ImageType
Ƭ ImageType: string | Blob | URL
Defined in
KeywordExtractPrompt
Ƭ KeywordExtractPrompt: typeof defaultKeywordExtractPrompt
Defined in
packages/core/src/Prompt.ts:382
ListIndex
Ƭ ListIndex: SummaryIndex
Defined in
packages/core/src/indices/summary/SummaryIndex.ts:264
ListIndexLLMRetriever
Ƭ ListIndexLLMRetriever: SummaryIndexLLMRetriever
Defined in
packages/core/src/indices/summary/SummaryIndexRetriever.ts:134
ListIndexRetriever
Ƭ ListIndexRetriever: SummaryIndexRetriever
Defined in
packages/core/src/indices/summary/SummaryIndexRetriever.ts:133
ListRetrieverMode
Ƭ ListRetrieverMode: SummaryRetrieverMode
Defined in
packages/core/src/indices/summary/SummaryIndex.ts:265
MessageContent
Ƭ MessageContent: string | MessageContentDetail[]
Extended type for the content of a message that allows for multi-modal messages.
Defined in
packages/core/src/ChatEngine.ts:350
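A minimal sketch of both forms. The exact fields of MessageContentDetail are defined by that interface and are not reproduced here:

```typescript
import type { MessageContent, MessageContentDetail } from "llamaindex";

// Plain-text content.
const textOnly: MessageContent = "Describe this image.";

// Multi-modal content is an array of MessageContentDetail entries
// (e.g. a text part plus an image part); see the MessageContentDetail
// interface for its fields.
const parts: MessageContentDetail[] = [];
const multiModal: MessageContent = parts;
```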
MessageType
Ƭ MessageType: "user" | "assistant" | "system" | "generic" | "function" | "memory"
Defined in
packages/core/src/llm/LLM.ts:31
Metadata
Ƭ Metadata: Record<string, any>
Defined in
OpenAIStreamToken
Ƭ OpenAIStreamToken: DefaultStreamToken
Defined in
packages/core/src/callbacks/CallbackManager.ts:41
QueryKeywordExtractPrompt
Ƭ QueryKeywordExtractPrompt: typeof defaultQueryKeywordExtractPrompt
Defined in
packages/core/src/Prompt.ts:398
RefinePrompt
Ƭ RefinePrompt: typeof defaultRefinePrompt
Defined in
packages/core/src/Prompt.ts:106
RelatedNodeType
Ƭ RelatedNodeType<T>: RelatedNodeInfo<T> | RelatedNodeInfo<T>[]
Type parameters
Name | Type |
---|---|
T | extends Metadata = Metadata |
Defined in
SimpleDirectoryReaderLoadDataProps
Ƭ SimpleDirectoryReaderLoadDataProps: Object
Type declaration
Name | Type |
---|---|
defaultReader? | BaseReader \| null |
directoryPath | string |
fileExtToReader? | Record<string, BaseReader> |
fs? | CompleteFileSystem |
Defined in
packages/core/src/readers/SimpleDirectoryReader.ts:52
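A sketch of how these props are commonly passed to SimpleDirectoryReader.loadData, assuming loadData accepts this props object; only directoryPath is supplied, so the optional fields fall back to their defaults:

```typescript
import { SimpleDirectoryReader } from "llamaindex";

// Read every supported file under ./data with the default per-extension readers.
const reader = new SimpleDirectoryReader();
const documents = await reader.loadData({ directoryPath: "./data" });
console.log(`Loaded ${documents.length} documents`);
```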
SimplePrompt
Ƭ SimplePrompt: (input: Record<string, string | undefined>) => string
Type declaration
▸ (input): string
A SimplePrompt is a function that takes a dictionary of inputs and returns a string. NOTE: this is a different interface from LlamaIndex Python. NOTE 2: undefined inputs default to the empty string, which makes it easy to calculate prompt sizes.
Parameters
Name | Type |
---|---|
input | Record<string, string \| undefined> |
Returns
string
Defined in
packages/core/src/Prompt.ts:10
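A minimal example of a function satisfying this signature; the template text is made up for illustration and is not one of the package's default prompts:

```typescript
import type { SimplePrompt } from "llamaindex";

// Missing inputs default to the empty string, matching the note above.
const greetingPrompt: SimplePrompt = (input) => {
  const { name = "", topic = "" } = input;
  return `You are a helpful assistant. Greet ${name} and explain ${topic}.`;
};

greetingPrompt({ name: "Ada", topic: "vector indexes" });
```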
SubQuestionPrompt
Ƭ SubQuestionPrompt: typeof defaultSubQuestionPrompt
Defined in
packages/core/src/Prompt.ts:314
SummaryPrompt
Ƭ SummaryPrompt: typeof defaultSummaryPrompt
Defined in
packages/core/src/Prompt.ts:73
TextQaPrompt
Ƭ TextQaPrompt: typeof defaultTextQaPrompt
Defined in
packages/core/src/Prompt.ts:37
TreeSummarizePrompt
Ƭ TreeSummarizePrompt: typeof defaultTreeSummarizePrompt
Defined in
packages/core/src/Prompt.ts:131
Variables
ALL_AVAILABLE_ANTHROPIC_MODELS
• Const ALL_AVAILABLE_ANTHROPIC_MODELS: Object
Type declaration
Name | Type |
---|---|
claude-2 | { contextWindow : number = 200000 } |
claude-2.contextWindow | number |
claude-instant-1 | { contextWindow : number = 100000 } |
claude-instant-1.contextWindow | number |
Defined in
packages/core/src/llm/LLM.ts:640
ALL_AVAILABLE_LLAMADEUCE_MODELS
• Const ALL_AVAILABLE_LLAMADEUCE_MODELS: Object
Type declaration
Name | Type |
---|---|
Llama-2-13b-chat-4bit | { contextWindow : number = 4096; replicateApi : string = "meta/llama-2-13b-chat:f4e2de70d66816a838a89eeeb621910adffb0dd0baba3976c96980970978018d" } |
Llama-2-13b-chat-4bit.contextWindow | number |
Llama-2-13b-chat-4bit.replicateApi | string |
Llama-2-13b-chat-old | { contextWindow : number = 4096; replicateApi : string = "a16z-infra/llama13b-v2-chat:df7690f1994d94e96ad9d568eac121aecf50684a0b0963b25a41cc40061269e5" } |
Llama-2-13b-chat-old.contextWindow | number |
Llama-2-13b-chat-old.replicateApi | string |
Llama-2-70b-chat-4bit | { contextWindow : number = 4096; replicateApi : string = "meta/llama-2-70b-chat:02e509c789964a7ea8736978a43525956ef40397be9033abf9fd2badfe68c9e3" } |
Llama-2-70b-chat-4bit.contextWindow | number |
Llama-2-70b-chat-4bit.replicateApi | string |
Llama-2-70b-chat-old | { contextWindow : number = 4096; replicateApi : string = "replicate/llama70b-v2-chat:e951f18578850b652510200860fc4ea62b3b16fac280f83ff32282f87bbd2e48" } |
Llama-2-70b-chat-old.contextWindow | number |
Llama-2-70b-chat-old.replicateApi | string |
Llama-2-7b-chat-4bit | { contextWindow : number = 4096; replicateApi : string = "meta/llama-2-7b-chat:13c3cdee13ee059ab779f0291d29054dab00a47dad8261375654de5540165fb0" } |
Llama-2-7b-chat-4bit.contextWindow | number |
Llama-2-7b-chat-4bit.replicateApi | string |
Llama-2-7b-chat-old | { contextWindow : number = 4096; replicateApi : string = "a16z-infra/llama7b-v2-chat:4f0a4744c7295c024a1de15e1a63c880d3da035fa1f49bfd344fe076074c8eea" } |
Llama-2-7b-chat-old.contextWindow | number |
Llama-2-7b-chat-old.replicateApi | string |
Defined in
packages/core/src/llm/LLM.ts:370
ALL_AVAILABLE_OPENAI_MODELS
• Const ALL_AVAILABLE_OPENAI_MODELS: Object
We currently support GPT-3.5 and GPT-4 models.
Type declaration
Name | Type |
---|---|
gpt-3.5-turbo | { contextWindow : number = 4096 } |
gpt-3.5-turbo.contextWindow | number |
gpt-3.5-turbo-1106 | { contextWindow : number = 16384 } |
gpt-3.5-turbo-1106.contextWindow | number |
gpt-3.5-turbo-16k | { contextWindow : number = 16384 } |
gpt-3.5-turbo-16k.contextWindow | number |
gpt-4 | { contextWindow : number = 8192 } |
gpt-4.contextWindow | number |
gpt-4-1106-preview | { contextWindow : number = 128000 } |
gpt-4-1106-preview.contextWindow | number |
gpt-4-32k | { contextWindow : number = 32768 } |
gpt-4-32k.contextWindow | number |
gpt-4-vision-preview | { contextWindow : number = 8192 } |
gpt-4-vision-preview.contextWindow | number |
Defined in
packages/core/src/llm/LLM.ts:119
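The table above can be used directly for context-window lookups, for example:

```typescript
import { ALL_AVAILABLE_OPENAI_MODELS } from "llamaindex";

// Look up the context window for one of the models listed above.
const model = "gpt-4-1106-preview";
const { contextWindow } = ALL_AVAILABLE_OPENAI_MODELS[model];
console.log(`${model} accepts up to ${contextWindow} tokens`); // 128000
```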
DEFAULT_CHUNK_OVERLAP
• Const DEFAULT_CHUNK_OVERLAP: 20
Defined in
packages/core/src/constants.ts:5
DEFAULT_CHUNK_OVERLAP_RATIO
• Const DEFAULT_CHUNK_OVERLAP_RATIO: 0.1
Defined in
packages/core/src/constants.ts:6
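These two defaults express overlap in different units: DEFAULT_CHUNK_OVERLAP is an absolute overlap count, while DEFAULT_CHUNK_OVERLAP_RATIO expresses overlap as a fraction, presumably of a chunk size. A small illustrative sketch (the chunkSize value below is an arbitrary example, not taken from this page):

```typescript
import {
  DEFAULT_CHUNK_OVERLAP,
  DEFAULT_CHUNK_OVERLAP_RATIO,
} from "llamaindex";

const chunkSize = 1024; // example chunk size for illustration
const ratioOverlap = Math.floor(chunkSize * DEFAULT_CHUNK_OVERLAP_RATIO);
console.log({ DEFAULT_CHUNK_OVERLAP, ratioOverlap }); // { DEFAULT_CHUNK_OVERLAP: 20, ratioOverlap: 102 }
```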