# Timbr

[Timbr](https://docs.timbr.ai/doc/docs/integration/langchain-sdk/) integrates natural language inputs with Timbr's ontology-driven semantic layer. The SDK connects to Timbr data models and leverages their semantic relationships and annotations, enabling users to query data in business-friendly language.

Timbr provides a pre-built SQL agent, `TimbrSqlAgent`, which covers the full flow from user prompt, through semantic SQL query generation and validation, to query execution and result analysis.

For customization and partial usage, you can use LangChain chains and LangGraph nodes with the five main tools:

- `IdentifyTimbrConceptChain` & `IdentifyConceptNode` - Identify relevant concepts from user prompts
- `GenerateTimbrSqlChain` & `GenerateTimbrSqlNode` - Generate SQL queries from natural language prompts
- `ValidateTimbrSqlChain` & `ValidateSemanticSqlNode` - Validate SQL queries against Timbr knowledge graph schemas
- `ExecuteTimbrQueryChain` & `ExecuteSemanticQueryNode` - Execute (semantic and regular) SQL queries against Timbr knowledge graph databases
- `GenerateAnswerChain` & `GenerateResponseNode` - Generate human-readable answers based on a given prompt and data rows

Additionally, `langchain-timbr` provides `TimbrLlmConnector` for manual integration with Timbr's semantic layer using LLM providers. This connector includes the following methods:

- `get_ontologies` - List Timbr's semantic knowledge graphs
- `get_concepts` - List the concepts of the selected knowledge graph ontology
- `get_views` - List the views of the selected knowledge graph ontology
- `determine_concept` - Identify relevant concepts from user prompts
- `generate_sql` - Generate SQL queries from natural language prompts
- `validate_sql` - Validate SQL queries against Timbr knowledge graph schemas
- `run_timbr_query` - Execute (semantic and regular) SQL queries against Timbr knowledge graph databases
- `run_llm_query` - Run the agent pipeline that determines the concept, generates SQL, and executes the query from a natural language prompt

## Quickstart

### Installation

#### Install the package

```bash
pip install langchain-timbr
```

#### Optional: Install with selected LLM provider

Choose one of: `openai`, `anthropic`, `google`, `azure_openai`, `snowflake`, `databricks` (or `all`)

```bash
pip install 'langchain-timbr[<your selected providers, separated by commas without spaces>]'
```
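For example, to install with the OpenAI and Anthropic extras (provider names as listed above):

```bash
pip install 'langchain-timbr[openai,anthropic]'
```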

## Configuration

Starting from `langchain-timbr` v2.0.0, all chains, agents, and nodes support optional environment-based configuration. You can set the following environment variables to provide default values and simplify setup for the provided tools:

### Timbr Connection Parameters

- **TIMBR_URL**: Default Timbr server URL
- **TIMBR_TOKEN**: Default Timbr authentication token
- **TIMBR_ONTOLOGY**: Default ontology/knowledge graph name

When these environment variables are set, the corresponding parameters (`url`, `token`, `ontology`) become optional in all chain and agent constructors and default to the environment values.

### LLM Configuration Parameters

- **LLM_TYPE**: The LLM provider type (one of the `langchain_timbr` `LlmTypes` enum values: 'openai-chat', 'anthropic-chat', 'chat-google-generative-ai', 'azure-openai-chat', 'snowflake-cortex', 'chat-databricks')
- **LLM_API_KEY**: The API key for authenticating with the LLM provider
- **LLM_MODEL**: The model name or deployment to use
- **LLM_TEMPERATURE**: Temperature setting for the LLM
- **LLM_ADDITIONAL_PARAMS**: Additional parameters as a dict or JSON string

When the LLM environment variables are set, the `llm` parameter becomes optional and defaults to an `LlmWrapper` built from the environment configuration.

Example environment setup:

```bash
# Timbr connection
export TIMBR_URL="https://your-timbr-app.com/"
export TIMBR_TOKEN="tk_XXXXXXXXXXXXXXXXXXXXXXXX"
export TIMBR_ONTOLOGY="timbr_knowledge_graph"

# LLM configuration
export LLM_TYPE="openai-chat"
export LLM_API_KEY="your-openai-api-key"
export LLM_MODEL="gpt-4o"
export LLM_TEMPERATURE="0.1"
export LLM_ADDITIONAL_PARAMS='{"max_tokens": 1000}'
```
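Since `LLM_ADDITIONAL_PARAMS` may be given as a JSON string, it must decode to a JSON object. The sketch below illustrates the expected format; the helper name is hypothetical and this is not the library's internal parsing logic:

```python
import json
import os

def parse_additional_params(raw: str) -> dict:
    """Decode a JSON-object string such as '{"max_tokens": 1000}'; fall back to {} otherwise."""
    try:
        value = json.loads(raw)
    except json.JSONDecodeError:
        return {}
    # Only a JSON object maps cleanly to keyword parameters
    return value if isinstance(value, dict) else {}

os.environ["LLM_ADDITIONAL_PARAMS"] = '{"max_tokens": 1000}'
params = parse_additional_params(os.environ["LLM_ADDITIONAL_PARAMS"])
print(params)  # {'max_tokens': 1000}
```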

## Usage

Import and use your intended chain/node, or use `TimbrLlmConnector` to integrate manually with Timbr's semantic layer. For a complete working agent example, see the [Timbr tool page](/docs/integrations/tools/timbr).

### ExecuteTimbrQueryChain example

```python
from langchain_openai import ChatOpenAI
from langchain_timbr import ExecuteTimbrQueryChain

# You can use the standard LangChain ChatOpenAI/ChatAnthropic models
# or any other LLM based on langchain_core.language_models.chat.BaseChatModel
llm = ChatOpenAI(model="gpt-4o", temperature=0, openai_api_key="open-ai-api-key")

# Optional alternative: use Timbr's LlmWrapper, which provides generic connections to different LLM providers
from langchain_timbr import LlmWrapper, LlmTypes
llm = LlmWrapper(llm_type=LlmTypes.OpenAI, api_key="open-ai-api-key", model="gpt-4o")

execute_timbr_query_chain = ExecuteTimbrQueryChain(
    llm=llm,
    url="https://your-timbr-app.com/",
    token="tk_XXXXXXXXXXXXXXXXXXXXXXXX",
    ontology="timbr_knowledge_graph",
    schema="dtimbr",                    # optional
    concept="Sales",                    # optional
    concepts_list=["Sales", "Orders"],  # optional
    views_list=["sales_view"],          # optional
    note="We only need sums",           # optional
    retries=3,                          # optional
    should_validate_sql=True,           # optional
)

result = execute_timbr_query_chain.invoke({"prompt": "What are the total sales for last month?"})
rows = result["rows"]
sql = result["sql"]
concept = result["concept"]
schema = result["schema"]
error = result.get("error", None)

usage_metadata = result.get("execute_timbr_usage_metadata", {})
determine_concept_usage = usage_metadata.get("determine_concept", {})
generate_sql_usage = usage_metadata.get("generate_sql", {})
# Each usage_metadata item contains:
# * 'approximate': Estimated token count calculated before invoking the LLM
# * 'input_tokens'/'output_tokens'/'total_tokens'/etc.: Actual token usage metrics returned by the LLM
```
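The usage metadata shape described in the comments above lends itself to a small accounting helper, for example to total actual token spend across pipeline stages. This is a sketch assuming only that shape; the sample numbers are made up:

```python
def total_llm_tokens(usage_metadata: dict) -> int:
    """Sum the actual 'total_tokens' reported by each pipeline stage."""
    return sum(stage.get("total_tokens", 0) for stage in usage_metadata.values())

# Hypothetical metadata in the documented shape, e.g. taken from
# result.get("execute_timbr_usage_metadata", {})
usage_metadata = {
    "determine_concept": {"approximate": 480, "input_tokens": 450, "output_tokens": 12, "total_tokens": 462},
    "generate_sql": {"approximate": 900, "input_tokens": 870, "output_tokens": 55, "total_tokens": 925},
}
print(total_llm_tokens(usage_metadata))  # 1387
```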

### Multiple chains using SequentialChain example

```python
from langchain.chains import SequentialChain
from langchain_timbr import ExecuteTimbrQueryChain, GenerateAnswerChain
from langchain_openai import ChatOpenAI

# You can use the standard LangChain ChatOpenAI/ChatAnthropic models
# or any other LLM based on langchain_core.language_models.chat.BaseChatModel
llm = ChatOpenAI(model="gpt-4o", temperature=0, openai_api_key="open-ai-api-key")

# Optional alternative: use Timbr's LlmWrapper, which provides generic connections to different LLM providers
from langchain_timbr import LlmWrapper, LlmTypes
llm = LlmWrapper(llm_type=LlmTypes.OpenAI, api_key="open-ai-api-key", model="gpt-4o")

execute_timbr_query_chain = ExecuteTimbrQueryChain(
    llm=llm,
    url="https://your-timbr-app.com/",
    token="tk_XXXXXXXXXXXXXXXXXXXXXXXX",
    ontology="timbr_knowledge_graph",
)

generate_answer_chain = GenerateAnswerChain(
    llm=llm,
    url="https://your-timbr-app.com/",
    token="tk_XXXXXXXXXXXXXXXXXXXXXXXX",
)

pipeline = SequentialChain(
    chains=[execute_timbr_query_chain, generate_answer_chain],
    input_variables=["prompt"],
    output_variables=["answer", "sql"]
)

result = pipeline.invoke({"prompt": "What are the total sales for last month?"})
```

## Additional Resources

- [PyPI](https://pypi.org/project/langchain-timbr)
- [GitHub](https://github.com/WPSemantix/langchain-timbr)
- [LangChain Timbr Docs](https://docs.timbr.ai/doc/docs/integration/langchain-sdk/)
- [LangGraph Timbr Docs](https://docs.timbr.ai/doc/docs/integration/langgraph-sdk)