-
Here is my code:
```python
import typing as t
import asyncio
from typing import List
from datasets import load_dataset, load_from_disk
from ragas.metrics import faithfulness, context_recall, context_p…
-
I'm having trouble running the function-calling example located here:
https://cloud.google.com/vertex-ai/generative-ai/docs/model-reference/function-calling#python_2
#### Environment details
- …
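For reference, the core of that example is declaring a function schema the model is allowed to call. A minimal sketch, assuming the `vertexai.generative_models` SDK; the `get_current_weather` function and its parameters below are illustrative, not copied from the linked page:

```python
# Function-calling declarations use an OpenAPI-style JSON schema.
# This function and its fields are illustrative placeholders.
weather_fn = {
    "name": "get_current_weather",
    "description": "Return the current weather in a given location.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "City name, e.g. 'Boston, MA'",
            },
        },
        "required": ["location"],
    },
}

print(sorted(weather_fn["parameters"]["properties"]))  # → ['location']

# With the SDK (requires GCP credentials, so shown as comments only):
# from vertexai.generative_models import FunctionDeclaration, GenerativeModel, Tool
# tool = Tool(function_declarations=[FunctionDeclaration(**weather_fn)])
# model = GenerativeModel("gemini-1.5-pro")
# response = model.generate_content("What's the weather in Boston?", tools=[tool])
```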
-
### Describe the bug
The SDK supports passing Langfuse trace data alongside OpenAI calls (and, I assume, other integrations): https://langfuse.com/docs/integrations/openai/python/get-started#custom-trace-proper…
-
From the [Specifications for the Digital Talking Book](http://www.daisy.org/z3986/2005/Z3986-2005.html):
> dc:Date
> Date of publication of the DTB. (Compare dtb:sourceDate and dtb:producedDate.)
…
-
### Problem:
**400 error** with the Vertex Batch Prediction API using _gemini-1.5-flash-001_ and _gemini-1.5-pro-001_
I am getting similar issues using Batch Prediction with gemini-1.5-flash-001…
-
### Checked other resources
- [X] I added a very descriptive title to this issue.
- [X] I searched the LangChain documentation with the integrated search.
- [X] I used the GitHub search to find a sim…
-
https://github.com/microsoft/z3guide/blob/d70efa36af36ec5c24384a2108a8378af4c62584/website/sidebars/smtlibSidebars.js#L17
Uncomment this line and remove the rest of the code.
-
Hi guys! I've been working on a new project:
https://github.com/threestudio-project/threestudio
threestudio is a unified framework for 3D content generation from text prompts, single images and fe…
-
I am a noob. Here is my code; how can I modify it to do batch inference?
---
import transformers

def load_model():
    model_id = 'llama3/Meta-Llama-3-70B-Instruct'
    pipeline = transformers.pipeline(
        "t…
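A minimal sketch of the batching side, under the assumption that the (truncated) pipeline is a standard `transformers` text-generation pipeline: pipeline objects accept a list of prompts plus a `batch_size` argument, so the main work is grouping prompts. The `chunked` helper is hypothetical; the actual pipeline call is left as comments because it needs the model weights:

```python
from typing import Iterator, List

def chunked(prompts: List[str], batch_size: int) -> Iterator[List[str]]:
    """Yield successive batches of prompts (hypothetical helper)."""
    for i in range(0, len(prompts), batch_size):
        yield prompts[i:i + batch_size]

prompts = [f"Prompt {n}" for n in range(10)]
print([len(batch) for batch in chunked(prompts, batch_size=4)])  # → [4, 4, 2]

# With the real pipeline (requires the model weights and a large GPU):
# outputs = pipeline(prompts, batch_size=4, max_new_tokens=256)
# Each element of `outputs` corresponds to one input prompt, in order.
```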
-
- Add information in the metadata for each chunk mentioning what type of content (Lecture Slide, HW, logistics, etc) it holds
- Based on this the LLM text generation is handled differently, by specif…
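The routing idea above can be sketched with plain dicts. All names here (`tag_chunk`, `PROMPT_TEMPLATES`, and the content-type labels) are hypothetical: each chunk carries a `content_type` metadata field, and the generation prompt is selected from it:

```python
def tag_chunk(text: str, content_type: str) -> dict:
    """Attach a content-type label to a retrieved chunk (hypothetical schema)."""
    return {"text": text, "metadata": {"content_type": content_type}}

# One prompt template per content type; the wording is illustrative.
PROMPT_TEMPLATES = {
    "lecture_slide": "Explain the following slide content:\n{text}",
    "homework": "Guide the student without revealing the answer:\n{text}",
    "logistics": "Answer this logistics question directly:\n{text}",
}

def build_prompt(chunk: dict) -> str:
    """Pick the template matching the chunk's content type."""
    kind = chunk["metadata"]["content_type"]
    return PROMPT_TEMPLATES[kind].format(text=chunk["text"])

chunk = tag_chunk("HW2 is due Friday at 5pm.", "logistics")
print(build_prompt(chunk).splitlines()[0])  # → Answer this logistics question directly:
```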