Add Embedding Model (HF)
Written by Richa Deshpande

You can use any embedding model to power your AI apps with MongoDB Atlas, including models from OpenAI, Cohere, and Amazon Bedrock.

Here is an example that uses a Hugging Face model to generate embeddings locally.

Create a Hugging Face account and generate an access token: https://huggingface.co/settings/tokens

Store the token alongside your MongoDB Atlas connection string (URI) and any other secrets. Best practice is to keep them in environment variables rather than hard-coding them, as shown below.
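For example, you could export the values in your shell and read them with os.environ. This is a minimal sketch; the variable names MONGODB_ATLAS_URI and HUGGINGFACEHUB_API_TOKEN are illustrative choices, not required names.

# Illustrative only: export these in your shell first, e.g.
#   export MONGODB_ATLAS_URI="mongodb+srv://<user>:<password>@<cluster>/..."
#   export HUGGINGFACEHUB_API_TOKEN="hf_..."
import os

uri = os.environ["MONGODB_ATLAS_URI"]
HUGGINGFACE_TOKEN = os.environ["HUGGINGFACEHUB_API_TOKEN"]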

Install Libraries:

pip install pymongo langchain langchain-mongodb sentence-transformers pypdf 

Code Sample:

from pymongo.mongo_client import MongoClient
from pymongo.server_api import ServerApi
from langchain_mongodb import MongoDBAtlasVectorSearch
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.document_loaders import PyPDFLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.prompts import PromptTemplate
from langchain.chains import RetrievalQA
from langchain_community.llms import HuggingFaceEndpoint
# Only needed if you swap in OpenAI models (requires the openai package):
# from langchain.embeddings import OpenAIEmbeddings
# from langchain.chat_models import ChatOpenAI
import pprint

# Your Atlas connection string (read it from an environment variable in practice)
uri = "<your Atlas connection string>"

# Create a new client and connect to the server
client = MongoClient(uri, server_api=ServerApi('1'))

# Hugging Face access token: only needed for the hosted Inference API
# (e.g. HuggingFaceEndpoint); the local sentence-transformers embeddings
# below do not require it
HUGGINGFACE_TOKEN = "<your access token>"

# Initialize embeddings: vectors are generated locally with the gte-large model
embeddings = HuggingFaceEmbeddings(model_name="thenlper/gte-large")
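Once the embeddings object is initialized, you can pass it to MongoDBAtlasVectorSearch to embed and store document chunks. The sketch below is illustrative: the PDF path, database and collection names, and the index name "vector_index" are placeholders, and it assumes you have already created an Atlas Vector Search index on the collection whose number of dimensions matches the embedding model.

# Load a PDF and split it into chunks (path and chunk sizes are placeholders)
loader = PyPDFLoader("sample.pdf")
docs = loader.load()

splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_documents(docs)

# Target collection in your Atlas cluster (names are placeholders)
collection = client["<db name>"]["<collection name>"]

# Embed the chunks with the Hugging Face model and store them in Atlas
vector_store = MongoDBAtlasVectorSearch.from_documents(
    documents=chunks,
    embedding=embeddings,
    collection=collection,
    index_name="vector_index",  # name of your Atlas Vector Search index
)

# Quick check: run a similarity search against the stored vectors
results = vector_store.similarity_search("What is this document about?", k=3)
pprint.pprint([r.page_content[:100] for r in results])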