","sha":"ec546ddbda3dfa0f05879457497902e00f1590bf","created_at":"2025-05-20 21:15","time_from_now":"2个月前","created_at_unix":1747746939}}]},"projectMenu":[{"menu_name":"home"},{"menu_name":"code"},{"menu_name":"issues"},{"menu_name":"devops"},{"menu_name":"versions"},{"menu_name":"wiki"},{"menu_name":"resources"},{"menu_name":"activity"}],"projectReadMe":"%7B%22type%22%3A%22file%22%2C%22encoding%22%3A%22base64%22%2C%22size%22%3A8507%2C%22name%22%3A%22README.md%22%2C%22path%22%3A%22README.md%22%2C%22content%22%3A%22%3Cp%20align%3D%5C%22center%5C%22%3E%5Cn%20%20%3Cbr%2F%3E%5Cn%20%20%3Cpicture%3E%5Cn%20%20%20%20%3Csource%20media%3D%5C%22(prefers-color-scheme%3A%20dark)%5C%22%20srcset%3D%5C%22https%3A%2F%2Fhuggingface.co%2Fdatasets%2Fhuggingface%2Fdocumentation-images%2Fraw%2Fmain%2Fhuggingfacejs-dark.svg%5C%22%3E%5Cn%20%20%20%20%3Csource%20media%3D%5C%22(prefers-color-scheme%3A%20light)%5C%22%20srcset%3D%5C%22https%3A%2F%2Fhuggingface.co%2Fdatasets%2Fhuggingface%2Fdocumentation-images%2Fraw%2Fmain%2Fhuggingfacejs-light.svg%5C%22%3E%5Cn%20%20%20%20%3Cimg%20alt%3D%5C%22huggingface%20javascript%20library%20logo%5C%22%20src%3D%5C%22https%3A%2F%2Fhuggingface.co%2Fdatasets%2Fhuggingface%2Fdocumentation-images%2Fraw%2Fmain%2Fhuggingfacejs-light.svg%5C%22%20width%3D%5C%22376%5C%22%20height%3D%5C%2259%5C%22%20style%3D%5C%22max-width%3A%20100%25%3B%5C%22%3E%5Cn%20%20%3C%2Fpicture%3E%5Cn%20%20%3Cbr%2F%3E%5Cn%20%20%3Cbr%2F%3E%5Cn%3C%2Fp%3E%5Cn%5Cn%60%60%60ts%5Cn%2F%2F%20Programmatically%20interact%20with%20the%20Hub%5Cn%5Cnawait%20createRepo(%7B%5Cn%20%20repo%3A%20%7B%20type%3A%20%5C%22model%5C%22%2C%20name%3A%20%5C%22my-user%2Fnlp-model%5C%22%20%7D%2C%5Cn%20%20accessToken%3A%20HF_TOKEN%5Cn%7D)%3B%5Cn%5Cnawait%20uploadFile(%7B%5Cn%20%20repo%3A%20%5C%22my-user%2Fnlp-model%5C%22%2C%5Cn%20%20accessToken%3A%20HF_TOKEN%2C%5Cn%20%20%2F%2F%20Can%20work%20with%20native%20File%20in%20browsers%5Cn%20%20file%3A%20%7B%5Cn%20%20%20%20path%3A%20%5C%22pytorch_model.bin%5C%22%2C%5Cn%20%20%20%20con
tent%3A%20new%20Blob(...)%5Cn%20%20%7D%5Cn%7D)%3B%5Cn%5Cn%2F%2F%20Use%20all%20supported%20Inference%20Providers!%5Cn%5Cnawait%20inference.chatCompletion(%7B%5Cn%20%20model%3A%20%5C%22meta-llama%2FLlama-3.1-8B-Instruct%5C%22%2C%5Cn%20%20provider%3A%20%5C%22sambanova%5C%22%2C%20%2F%2F%20or%20together%2C%20fal-ai%2C%20replicate%2C%20cohere%20%E2%80%A6%5Cn%20%20messages%3A%20%5B%5Cn%20%20%20%20%7B%5Cn%20%20%20%20%20%20role%3A%20%5C%22user%5C%22%2C%5Cn%20%20%20%20%20%20content%3A%20%5C%22Hello%2C%20nice%20to%20meet%20you!%5C%22%2C%5Cn%20%20%20%20%7D%2C%5Cn%20%20%5D%2C%5Cn%20%20max_tokens%3A%20512%2C%5Cn%20%20temperature%3A%200.5%2C%5Cn%7D)%3B%5Cn%5Cnawait%20inference.textToImage(%7B%5Cn%20%20model%3A%20%5C%22black-forest-labs%2FFLUX.1-dev%5C%22%2C%5Cn%20%20provider%3A%20%5C%22replicate%5C%22%2C%5Cn%20%20inputs%3A%20%5C%22a%20picture%20of%20a%20green%20bird%5C%22%2C%5Cn%7D)%3B%5Cn%5Cn%2F%2F%20and%20much%20more%E2%80%A6%5Cn%60%60%60%5Cn%5Cn%23%20Hugging%20Face%20JS%20libraries%5Cn%5CnThis%20is%20a%20collection%20of%20JS%20libraries%20to%20interact%20with%20the%20Hugging%20Face%20API%2C%20with%20TS%20types%20included.%5Cn%5Cn-%20%5B%40huggingface%2Finference%5D(packages%2Finference%2FREADME.md)%3A%20Use%20all%20supported%20(serverless)%20Inference%20Providers%20or%20switch%20to%20Inference%20Endpoints%20(dedicated)%20to%20make%20calls%20to%20100%2C000%2B%20Machine%20Learning%20models%5Cn-%20%5B%40huggingface%2Fhub%5D(packages%2Fhub%2FREADME.md)%3A%20Interact%20with%20huggingface.co%20to%20create%20or%20delete%20repos%20and%20commit%20%2F%20download%20files%5Cn-%20%5B%40huggingface%2Fmcp-client%5D(packages%2Fmcp-client%2FREADME.md)%3A%20A%20Model%20Context%20Protocol%20(MCP)%20client%2C%20and%20a%20tiny%20Agent%20library%2C%20built%20on%20top%20of%20InferenceClient.%5Cn-%20%5B%40huggingface%2Fgguf%5D(packages%2Fgguf%2FREADME.md)%3A%20A%20GGUF%20parser%20that%20works%20on%20remotely%20hosted%20files.%5Cn-%20%5B%40huggingface%2Fdduf%5D(packages%2Fdduf%2FREADME.md)%3A%20Similar
%20package%20for%20DDUF%20(DDUF%20Diffusers%20Unified%20Format)%5Cn-%20%5B%40huggingface%2Ftasks%5D(packages%2Ftasks%2FREADME.md)%3A%20The%20definition%20files%20and%20source-of-truth%20for%20the%20Hub's%20main%20primitives%20like%20pipeline%20tasks%2C%20model%20libraries%2C%20etc.%5Cn-%20%5B%40huggingface%2Fjinja%5D(packages%2Fjinja%2FREADME.md)%3A%20A%20minimalistic%20JS%20implementation%20of%20the%20Jinja%20templating%20engine%2C%20to%20be%20used%20for%20ML%20chat%20templates.%5Cn-%20%5B%40huggingface%2Fspace-header%5D(packages%2Fspace-header%2FREADME.md)%3A%20Use%20the%20Space%20%60mini_header%60%20outside%20Hugging%20Face%5Cn-%20%5B%40huggingface%2Follama-utils%5D(packages%2Follama-utils%2FREADME.md)%3A%20Various%20utilities%20for%20maintaining%20Ollama%20compatibility%20with%20models%20on%20the%20Hugging%20Face%20Hub.%5Cn-%20%5B%40huggingface%2Ftiny-agents%5D(packages%2Ftiny-agents%2FREADME.md)%3A%20A%20tiny%2C%20model-agnostic%20library%20for%20building%20AI%20agents%20that%20can%20use%20tools.%5Cn%5Cn%5CnWe%20use%20modern%20features%20to%20avoid%20polyfills%20and%20dependencies%2C%20so%20the%20libraries%20will%20only%20work%20on%20modern%20browsers%20%2F%20Node.js%20%3E%3D%2018%20%2F%20Bun%20%2F%20Deno.%5Cn%5CnThe%20libraries%20are%20still%20very%20young%2C%20please%20help%20us%20by%20opening%20issues!%5Cn%5Cn%23%23%20Installation%5Cn%5Cn%23%23%23%20From%20NPM%5Cn%5CnTo%20install%20via%20NPM%2C%20you%20can%20download%20the%20libraries%20as%20needed%3A%5Cn%5Cn%60%60%60bash%5Cnnpm%20install%20%40huggingface%2Finference%5Cnnpm%20install%20%40huggingface%2Fhub%5Cnnpm%20install%20%40huggingface%2Fmcp-client%5Cn%60%60%60%5Cn%5CnThen%20import%20the%20libraries%20in%20your%20code%3A%5Cn%5Cn%60%60%60ts%5Cnimport%20%7B%20InferenceClient%20%7D%20from%20%5C%22%40huggingface%2Finference%5C%22%3B%5Cnimport%20%7B%20createRepo%2C%20commit%2C%20deleteRepo%2C%20listFiles%20%7D%20from%20%5C%22%40huggingface%2Fhub%5C%22%3B%5Cnimport%20%7B%20McpClient%20%7D%20from%20%5C%22%40hug
gingface%2Fmcp-client%5C%22%3B%5Cnimport%20type%20%7B%20RepoId%20%7D%20from%20%5C%22%40huggingface%2Fhub%5C%22%3B%5Cn%60%60%60%5Cn%5Cn%23%23%23%20From%20CDN%20or%20Static%20hosting%5Cn%5CnYou%20can%20run%20our%20packages%20with%20vanilla%20JS%2C%20without%20any%20bundler%2C%20by%20using%20a%20CDN%20or%20static%20hosting.%20Using%20%5BES%20modules%5D(https%3A%2F%2Fhacks.mozilla.org%2F2018%2F03%2Fes-modules-a-cartoon-deep-dive%2F)%2C%20i.e.%20%60%3Cscript%20type%3D%5C%22module%5C%22%3E%60%2C%20you%20can%20import%20the%20libraries%20in%20your%20code%3A%5Cn%5Cn%60%60%60html%5Cn%3Cscript%20type%3D%5C%22module%5C%22%3E%5Cn%20%20%20%20import%20%7B%20InferenceClient%20%7D%20from%20'https%3A%2F%2Fcdn.jsdelivr.net%2Fnpm%2F%40huggingface%2Finference%404.3.2%2F%2Besm'%3B%5Cn%20%20%20%20import%20%7B%20createRepo%2C%20commit%2C%20deleteRepo%2C%20listFiles%20%7D%20from%20%5C%22https%3A%2F%2Fcdn.jsdelivr.net%2Fnpm%2F%40huggingface%2Fhub%402.3.0%2F%2Besm%5C%22%3B%5Cn%3C%2Fscript%3E%5Cn%60%60%60%5Cn%5Cn%23%23%23%20Deno%5Cn%5Cn%60%60%60ts%5Cn%2F%2F%20esm.sh%5Cnimport%20%7B%20InferenceClient%20%7D%20from%20%5C%22https%3A%2F%2Fesm.sh%2F%40huggingface%2Finference%5C%22%5Cn%5Cnimport%20%7B%20createRepo%2C%20commit%2C%20deleteRepo%2C%20listFiles%20%7D%20from%20%5C%22https%3A%2F%2Fesm.sh%2F%40huggingface%2Fhub%5C%22%5Cn%2F%2F%20or%20npm%3A%5Cnimport%20%7B%20InferenceClient%20%7D%20from%20%5C%22npm%3A%40huggingface%2Finference%5C%22%5Cn%5Cnimport%20%7B%20createRepo%2C%20commit%2C%20deleteRepo%2C%20listFiles%20%7D%20from%20%5C%22npm%3A%40huggingface%2Fhub%5C%22%5Cn%60%60%60%5Cn%5Cn%23%23%20Usage%20examples%5Cn%5CnGet%20your%20HF%20access%20token%20in%20your%20%5Baccount%20settings%5D(https%3A%2F%2Fhuggingface.co%2Fsettings%2Ftokens).%5Cn%5Cn%23%23%23%20%40huggingface%2Finference%20examples%5Cn%5Cn%60%60%60ts%5Cnimport%20%7B%20InferenceClient%20%7D%20from%20%5C%22%40huggingface%2Finference%5C%22%3B%5Cn%5Cnconst%20HF_TOKEN%20%3D%20%5C%22hf_...%5C%22%3B%5Cn%5Cnconst%20client%20%3D%20new%20Infere
nceClient(HF_TOKEN)%3B%5Cn%5Cn%2F%2F%20Chat%20completion%20API%5Cnconst%20out%20%3D%20await%20client.chatCompletion(%7B%5Cn%20%20model%3A%20%5C%22meta-llama%2FLlama-3.1-8B-Instruct%5C%22%2C%5Cn%20%20messages%3A%20%5B%7B%20role%3A%20%5C%22user%5C%22%2C%20content%3A%20%5C%22Hello%2C%20nice%20to%20meet%20you!%5C%22%20%7D%5D%2C%5Cn%20%20max_tokens%3A%20512%5Cn%7D)%3B%5Cnconsole.log(out.choices%5B0%5D.message)%3B%5Cn%5Cn%2F%2F%20Streaming%20chat%20completion%20API%5Cnfor%20await%20(const%20chunk%20of%20client.chatCompletionStream(%7B%5Cn%20%20model%3A%20%5C%22meta-llama%2FLlama-3.1-8B-Instruct%5C%22%2C%5Cn%20%20messages%3A%20%5B%7B%20role%3A%20%5C%22user%5C%22%2C%20content%3A%20%5C%22Hello%2C%20nice%20to%20meet%20you!%5C%22%20%7D%5D%2C%5Cn%20%20max_tokens%3A%20512%5Cn%7D))%20%7B%5Cn%20%20console.log(chunk.choices%5B0%5D.delta.content)%3B%5Cn%7D%5Cn%5Cn%2F%2F%2F%20Using%20a%20third-party%20provider%3A%5Cnawait%20client.chatCompletion(%7B%5Cn%20%20model%3A%20%5C%22meta-llama%2FLlama-3.1-8B-Instruct%5C%22%2C%5Cn%20%20messages%3A%20%5B%7B%20role%3A%20%5C%22user%5C%22%2C%20content%3A%20%5C%22Hello%2C%20nice%20to%20meet%20you!%5C%22%20%7D%5D%2C%5Cn%20%20max_tokens%3A%20512%2C%5Cn%20%20provider%3A%20%5C%22sambanova%5C%22%2C%20%2F%2F%20or%20together%2C%20fal-ai%2C%20replicate%2C%20cohere%20%E2%80%A6%5Cn%7D)%5Cn%5Cnawait%20client.textToImage(%7B%5Cn%20%20model%3A%20%5C%22black-forest-labs%2FFLUX.1-dev%5C%22%2C%5Cn%20%20inputs%3A%20%5C%22a%20picture%20of%20a%20green%20bird%5C%22%2C%5Cn%20%20provider%3A%20%5C%22fal-ai%5C%22%2C%5Cn%7D)%5Cn%5Cn%5Cn%5Cn%2F%2F%20You%20can%20also%20omit%20%5C%22model%5C%22%20to%20use%20the%20recommended%20model%20for%20the%20task%5Cnawait%20client.translation(%7B%5Cn%20%20inputs%3A%20%5C%22My%20name%20is%20Wolfgang%20and%20I%20live%20in%20Amsterdam%5C%22%2C%5Cn%20%20parameters%3A%20%7B%5Cn%20%20%20%20src_lang%3A%20%5C%22en%5C%22%2C%5Cn%20%20%20%20tgt_lang%3A%20%5C%22fr%5C%22%2C%5Cn%20%20%7D%2C%5Cn%7D)%3B%5Cn%5Cn%2F%2F%20pass%20multimodal%20files%20or%20
URLs%20as%20inputs%5Cnawait%20client.imageToText(%7B%5Cn%20%20model%3A%20'nlpconnect%2Fvit-gpt2-image-captioning'%2C%5Cn%20%20data%3A%20await%20(await%20fetch('https%3A%2F%2Fpicsum.photos%2F300%2F300')).blob()%2C%5Cn%7D)%5Cn%5Cn%2F%2F%20Using%20your%20own%20dedicated%20inference%20endpoint%3A%20https%3A%2F%2Fhf.co%2Fdocs%2Finference-endpoints%2F%5Cnconst%20gpt2Client%20%3D%20client.endpoint('https%3A%2F%2Fxyz.eu-west-1.aws.endpoints.huggingface.cloud%2Fgpt2')%3B%5Cnconst%20%7B%20generated_text%20%7D%20%3D%20await%20gpt2Client.textGeneration(%7B%20inputs%3A%20'The%20answer%20to%20the%20universe%20is'%20%7D)%3B%5Cn%5Cn%2F%2F%20Chat%20Completion%5Cnconst%20llamaEndpoint%20%3D%20client.endpoint(%5Cn%20%20%5C%22https%3A%2F%2Frouter.huggingface.co%2Fhf-inference%2Fmodels%2Fmeta-llama%2FLlama-3.1-8B-Instruct%5C%22%5Cn)%3B%5Cnconst%20out%20%3D%20await%20llamaEndpoint.chatCompletion(%7B%5Cn%20%20model%3A%20%5C%22meta-llama%2FLlama-3.1-8B-Instruct%5C%22%2C%5Cn%20%20messages%3A%20%5B%7B%20role%3A%20%5C%22user%5C%22%2C%20content%3A%20%5C%22Hello%2C%20nice%20to%20meet%20you!%5C%22%20%7D%5D%2C%5Cn%20%20max_tokens%3A%20512%2C%5Cn%7D)%3B%5Cnconsole.log(out.choices%5B0%5D.message)%3B%5Cn%60%60%60%5Cn%5Cn%23%23%23%20%40huggingface%2Fhub%20examples%5Cn%5Cn%60%60%60ts%5Cnimport%20%7B%20createRepo%2C%20uploadFile%2C%20deleteFiles%20%7D%20from%20%5C%22%40huggingface%2Fhub%5C%22%3B%5Cn%5Cnconst%20HF_TOKEN%20%3D%20%5C%22hf_...%5C%22%3B%5Cn%5Cnawait%20createRepo(%7B%5Cn%20%20repo%3A%20%5C%22my-user%2Fnlp-model%5C%22%2C%20%2F%2F%20or%20%7B%20type%3A%20%5C%22model%5C%22%2C%20name%3A%20%5C%22my-user%2Fnlp-test%5C%22%20%7D%2C%5Cn%20%20accessToken%3A%20HF_TOKEN%5Cn%7D)%3B%5Cn%5Cnawait%20uploadFile(%7B%5Cn%20%20repo%3A%20%5C%22my-user%2Fnlp-model%5C%22%2C%5Cn%20%20accessToken%3A%20HF_TOKEN%2C%5Cn%20%20%2F%2F%20Can%20work%20with%20native%20File%20in%20browsers%5Cn%20%20file%3A%20%7B%5Cn%20%20%20%20path%3A%20%5C%22pytorch_model.bin%5C%22%2C%5Cn%20%20%20%20content%3A%20new%20Blob(...)%5Cn%20%20%7D%5
Cn%7D)%3B%5Cn%5Cnawait%20deleteFiles(%7B%5Cn%20%20repo%3A%20%7B%20type%3A%20%5C%22space%5C%22%2C%20name%3A%20%5C%22my-user%2Fmy-space%5C%22%20%7D%2C%20%2F%2F%20or%20%5C%22spaces%2Fmy-user%2Fmy-space%5C%22%5Cn%20%20accessToken%3A%20HF_TOKEN%2C%5Cn%20%20paths%3A%20%5B%5C%22README.md%5C%22%2C%20%5C%22.gitattributes%5C%22%5D%5Cn%7D)%3B%5Cn%60%60%60%5Cn%5Cn%23%23%23%20%40huggingface%2Fmcp-client%20example%5Cn%5Cn%60%60%60ts%5Cnimport%20%7B%20Agent%20%7D%20from%20'%40huggingface%2Fmcp-client'%3B%5Cn%5Cnconst%20HF_TOKEN%20%3D%20%5C%22hf_...%5C%22%3B%5Cn%5Cnconst%20agent%20%3D%20new%20Agent(%7B%5Cn%20%20provider%3A%20%5C%22auto%5C%22%2C%5Cn%20%20model%3A%20%5C%22Qwen%2FQwen2.5-72B-Instruct%5C%22%2C%5Cn%20%20apiKey%3A%20HF_TOKEN%2C%5Cn%20%20servers%3A%20%5B%5Cn%20%20%20%20%7B%5Cn%20%20%20%20%20%20%2F%2F%20Playwright%20MCP%5Cn%20%20%20%20%20%20command%3A%20%5C%22npx%5C%22%2C%5Cn%20%20%20%20%20%20args%3A%20%5B%5C%22%40playwright%2Fmcp%40latest%5C%22%5D%2C%5Cn%20%20%20%20%7D%2C%5Cn%20%20%5D%2C%5Cn%7D)%3B%5Cn%5Cnawait%20agent.loadTools()%3B%5Cn%5Cnfor%20await%20(const%20chunk%20of%20agent.run(%5C%22What%20are%20the%20top%205%20trending%20models%20on%20Hugging%20Face%3F%5C%22))%20%7B%5Cn%20%20%20%20if%20(%5C%22choices%5C%22%20in%20chunk)%20%7B%5Cn%20%20%20%20%20%20%20%20const%20delta%20%3D%20chunk.choices%5B0%5D%3F.delta%3B%5Cn%20%20%20%20%20%20%20%20if%20(delta.content)%20%7B%5Cn%20%20%20%20%20%20%20%20%20%20%20%20console.log(delta.content)%3B%5Cn%20%20%20%20%20%20%20%20%7D%5Cn%20%20%20%20%7D%5Cn%7D%5Cn%60%60%60%5Cn%5CnThere%20are%20more%20features%20of%20course%2C%20check%20each%20library's%20README!%5Cn%5Cn%23%23%20Formatting%20%26%20testing%5Cn%5Cn%60%60%60console%5Cnsudo%20corepack%20enable%5Cnpnpm%20install%5Cn%5Cnpnpm%20-r%20format%3Acheck%5Cnpnpm%20-r%20lint%3Acheck%5Cnpnpm%20-r%20test%5Cn%60%60%60%5Cn%5Cn%23%23%20Building%5Cn%5Cn%60%60%60%5Cnpnpm%20-r%20build%5Cn%60%60%60%5Cn%5CnThis%20will%20generate%20ESM%20and%20CJS%20javascript%20files%20in%20%60packages%2F*%2Fdist%6
0%2C%20eg%20%60packages%2Finference%2Fdist%2Findex.mjs%60.%5Cn%22%2C%22sha%22%3A%226ee5e37a08077fd9c5bc675b17df005e04307449%22%2C%22replace_content%22%3A%22%3Cp%20align%3D%5C%22center%5C%22%3E%5Cn%20%20%3Cbr%2F%3E%5Cn%20%20%3Cpicture%3E%5Cn%20%20%20%20%3Csource%20media%3D%5C%22(prefers-color-scheme%3A%20dark)%5C%22%20srcset%3D%5C%22https%3A%2F%2Fhuggingface.co%2Fdatasets%2Fhuggingface%2Fdocumentation-images%2Fraw%2Fmain%2Fhuggingfacejs-dark.svg%5C%22%3E%5Cn%20%20%20%20%3Csource%20media%3D%5C%22(prefers-color-scheme%3A%20light)%5C%22%20srcset%3D%5C%22https%3A%2F%2Fhuggingface.co%2Fdatasets%2Fhuggingface%2Fdocumentation-images%2Fraw%2Fmain%2Fhuggingfacejs-light.svg%5C%22%3E%5Cn%20%20%20%20%3Cimg%20alt%3D%5C%22huggingface%20javascript%20library%20logo%5C%22%20src%3D%5C%22https%3A%2F%2Fhuggingface.co%2Fdatasets%2Fhuggingface%2Fdocumentation-images%2Fraw%2Fmain%2Fhuggingfacejs-light.svg%5C%22%20width%3D%5C%22376%5C%22%20height%3D%5C%2259%5C%22%20style%3D%5C%22max-width%3A%20100%25%3B%5C%22%3E%5Cn%20%20%3C%2Fpicture%3E%5Cn%20%20%3Cbr%2F%3E%5Cn%20%20%3Cbr%2F%3E%5Cn%3C%2Fp%3E%5Cn%5Cn%60%60%60ts%5Cn%2F%2F%20Programmatically%20interact%20with%20the%20Hub%5Cn%5Cnawait%20createRepo(%7B%5Cn%20%20repo%3A%20%7B%20type%3A%20%5C%22model%5C%22%2C%20name%3A%20%5C%22my-user%2Fnlp-model%5C%22%20%7D%2C%5Cn%20%20accessToken%3A%20HF_TOKEN%5Cn%7D)%3B%5Cn%5Cnawait%20uploadFile(%7B%5Cn%20%20repo%3A%20%5C%22my-user%2Fnlp-model%5C%22%2C%5Cn%20%20accessToken%3A%20HF_TOKEN%2C%5Cn%20%20%2F%2F%20Can%20work%20with%20native%20File%20in%20browsers%5Cn%20%20file%3A%20%7B%5Cn%20%20%20%20path%3A%20%5C%22pytorch_model.bin%5C%22%2C%5Cn%20%20%20%20content%3A%20new%20Blob(...)%5Cn%20%20%7D%5Cn%7D)%3B%5Cn%5Cn%2F%2F%20Use%20all%20supported%20Inference%20Providers!%5Cn%5Cnawait%20inference.chatCompletion(%7B%5Cn%20%20model%3A%20%5C%22meta-llama%2FLlama-3.1-8B-Instruct%5C%22%2C%5Cn%20%20provider%3A%20%5C%22sambanova%5C%22%2C%20%2F%2F%20or%20together%2C%20fal-ai%2C%20replicate%2C%20cohere%20%E2%80%A6%5Cn%20%20mes
sages%3A%20%5B%5Cn%20%20%20%20%7B%5Cn%20%20%20%20%20%20role%3A%20%5C%22user%5C%22%2C%5Cn%20%20%20%20%20%20content%3A%20%5C%22Hello%2C%20nice%20to%20meet%20you!%5C%22%2C%5Cn%20%20%20%20%7D%2C%5Cn%20%20%5D%2C%5Cn%20%20max_tokens%3A%20512%2C%5Cn%20%20temperature%3A%200.5%2C%5Cn%7D)%3B%5Cn%5Cnawait%20inference.textToImage(%7B%5Cn%20%20model%3A%20%5C%22black-forest-labs%2FFLUX.1-dev%5C%22%2C%5Cn%20%20provider%3A%20%5C%22replicate%5C%22%2C%5Cn%20%20inputs%3A%20%5C%22a%20picture%20of%20a%20green%20bird%5C%22%2C%5Cn%7D)%3B%5Cn%5Cn%2F%2F%20and%20much%20more%E2%80%A6%5Cn%60%60%60%5Cn%5Cn%23%20Hugging%20Face%20JS%20libraries%5Cn%5CnThis%20is%20a%20collection%20of%20JS%20libraries%20to%20interact%20with%20the%20Hugging%20Face%20API%2C%20with%20TS%20types%20included.%5Cn%5Cn-%20%5B%40huggingface%2Finference%5D(%2Flouis_lifu%2Fhuggingface%2Ftree%2Fmain%2Fpackages%2Finference%2FREADME.md)%3A%20Use%20all%20supported%20(serverless)%20Inference%20Providers%20or%20switch%20to%20Inference%20Endpoints%20(dedicated)%20to%20make%20calls%20to%20100%2C000%2B%20Machine%20Learning%20models%5Cn-%20%5B%40huggingface%2Fhub%5D(%2Flouis_lifu%2Fhuggingface%2Ftree%2Fmain%2Fpackages%2Fhub%2FREADME.md)%3A%20Interact%20with%20huggingface.co%20to%20create%20or%20delete%20repos%20and%20commit%20%2F%20download%20files%5Cn-%20%5B%40huggingface%2Fmcp-client%5D(%2Flouis_lifu%2Fhuggingface%2Ftree%2Fmain%2Fpackages%2Fmcp-client%2FREADME.md)%3A%20A%20Model%20Context%20Protocol%20(MCP)%20client%2C%20and%20a%20tiny%20Agent%20library%2C%20built%20on%20top%20of%20InferenceClient.%5Cn-%20%5B%40huggingface%2Fgguf%5D(%2Flouis_lifu%2Fhuggingface%2Ftree%2Fmain%2Fpackages%2Fgguf%2FREADME.md)%3A%20A%20GGUF%20parser%20that%20works%20on%20remotely%20hosted%20files.%5Cn-%20%5B%40huggingface%2Fdduf%5D(%2Flouis_lifu%2Fhuggingface%2Ftree%2Fmain%2Fpackages%2Fdduf%2FREADME.md)%3A%20Similar%20package%20for%20DDUF%20(DDUF%20Diffusers%20Unified%20Format)%5Cn-%20%5B%40huggingface%2Ftasks%5D(%2Flouis_lifu%2Fhuggingface%2Ftree%2Fmain%2
Fpackages%2Ftasks%2FREADME.md)%3A%20The%20definition%20files%20and%20source-of-truth%20for%20the%20Hub's%20main%20primitives%20like%20pipeline%20tasks%2C%20model%20libraries%2C%20etc.%5Cn-%20%5B%40huggingface%2Fjinja%5D(%2Flouis_lifu%2Fhuggingface%2Ftree%2Fmain%2Fpackages%2Fjinja%2FREADME.md)%3A%20A%20minimalistic%20JS%20implementation%20of%20the%20Jinja%20templating%20engine%2C%20to%20be%20used%20for%20ML%20chat%20templates.%5Cn-%20%5B%40huggingface%2Fspace-header%5D(%2Flouis_lifu%2Fhuggingface%2Ftree%2Fmain%2Fpackages%2Fspace-header%2FREADME.md)%3A%20Use%20the%20Space%20%60mini_header%60%20outside%20Hugging%20Face%5Cn-%20%5B%40huggingface%2Follama-utils%5D(%2Flouis_lifu%2Fhuggingface%2Ftree%2Fmain%2Fpackages%2Follama-utils%2FREADME.md)%3A%20Various%20utilities%20for%20maintaining%20Ollama%20compatibility%20with%20models%20on%20the%20Hugging%20Face%20Hub.%5Cn-%20%5B%40huggingface%2Ftiny-agents%5D(%2Flouis_lifu%2Fhuggingface%2Ftree%2Fmain%2Fpackages%2Ftiny-agents%2FREADME.md)%3A%20A%20tiny%2C%20model-agnostic%20library%20for%20building%20AI%20agents%20that%20can%20use%20tools.%5Cn%5Cn%5CnWe%20use%20modern%20features%20to%20avoid%20polyfills%20and%20dependencies%2C%20so%20the%20libraries%20will%20only%20work%20on%20modern%20browsers%20%2F%20Node.js%20%3E%3D%2018%20%2F%20Bun%20%2F%20Deno.%5Cn%5CnThe%20libraries%20are%20still%20very%20young%2C%20please%20help%20us%20by%20opening%20issues!%5Cn%5Cn%23%23%20Installation%5Cn%5Cn%23%23%23%20From%20NPM%5Cn%5CnTo%20install%20via%20NPM%2C%20you%20can%20download%20the%20libraries%20as%20needed%3A%5Cn%5Cn%60%60%60bash%5Cnnpm%20install%20%40huggingface%2Finference%5Cnnpm%20install%20%40huggingface%2Fhub%5Cnnpm%20install%20%40huggingface%2Fmcp-client%5Cn%60%60%60%5Cn%5CnThen%20import%20the%20libraries%20in%20your%20code%3A%5Cn%5Cn%60%60%60ts%5Cnimport%20%7B%20InferenceClient%20%7D%20from%20%5C%22%40huggingface%2Finference%5C%22%3B%5Cnimport%20%7B%20createRepo%2C%20commit%2C%20deleteRepo%2C%20listFiles%20%7D%20from%20%5C%22%40huggi
ngface%2Fhub%5C%22%3B%5Cnimport%20%7B%20McpClient%20%7D%20from%20%5C%22%40huggingface%2Fmcp-client%5C%22%3B%5Cnimport%20type%20%7B%20RepoId%20%7D%20from%20%5C%22%40huggingface%2Fhub%5C%22%3B%5Cn%60%60%60%5Cn%5Cn%23%23%23%20From%20CDN%20or%20Static%20hosting%5Cn%5CnYou%20can%20run%20our%20packages%20with%20vanilla%20JS%2C%20without%20any%20bundler%2C%20by%20using%20a%20CDN%20or%20static%20hosting.%20Using%20%5BES%20modules%5D(https%3A%2F%2Fhacks.mozilla.org%2F2018%2F03%2Fes-modules-a-cartoon-deep-dive%2F)%2C%20i.e.%20%60%3Cscript%20type%3D%5C%22module%5C%22%3E%60%2C%20you%20can%20import%20the%20libraries%20in%20your%20code%3A%5Cn%5Cn%60%60%60html%5Cn%3Cscript%20type%3D%5C%22module%5C%22%3E%5Cn%20%20%20%20import%20%7B%20InferenceClient%20%7D%20from%20'https%3A%2F%2Fcdn.jsdelivr.net%2Fnpm%2F%40huggingface%2Finference%404.3.2%2F%2Besm'%3B%5Cn%20%20%20%20import%20%7B%20createRepo%2C%20commit%2C%20deleteRepo%2C%20listFiles%20%7D%20from%20%5C%22https%3A%2F%2Fcdn.jsdelivr.net%2Fnpm%2F%40huggingface%2Fhub%402.3.0%2F%2Besm%5C%22%3B%5Cn%3C%2Fscript%3E%5Cn%60%60%60%5Cn%5Cn%23%23%23%20Deno%5Cn%5Cn%60%60%60ts%5Cn%2F%2F%20esm.sh%5Cnimport%20%7B%20InferenceClient%20%7D%20from%20%5C%22https%3A%2F%2Fesm.sh%2F%40huggingface%2Finference%5C%22%5Cn%5Cnimport%20%7B%20createRepo%2C%20commit%2C%20deleteRepo%2C%20listFiles%20%7D%20from%20%5C%22https%3A%2F%2Fesm.sh%2F%40huggingface%2Fhub%5C%22%5Cn%2F%2F%20or%20npm%3A%5Cnimport%20%7B%20InferenceClient%20%7D%20from%20%5C%22npm%3A%40huggingface%2Finference%5C%22%5Cn%5Cnimport%20%7B%20createRepo%2C%20commit%2C%20deleteRepo%2C%20listFiles%20%7D%20from%20%5C%22npm%3A%40huggingface%2Fhub%5C%22%5Cn%60%60%60%5Cn%5Cn%23%23%20Usage%20examples%5Cn%5CnGet%20your%20HF%20access%20token%20in%20your%20%5Baccount%20settings%5D(https%3A%2F%2Fhuggingface.co%2Fsettings%2Ftokens).%5Cn%5Cn%23%23%23%20%40huggingface%2Finference%20examples%5Cn%5Cn%60%60%60ts%5Cnimport%20%7B%20InferenceClient%20%7D%20from%20%5C%22%40huggingface%2Finference%5C%22%3B%5Cn%5Cnconst%20HF_T
OKEN%20%3D%20%5C%22hf_...%5C%22%3B%5Cn%5Cnconst%20client%20%3D%20new%20InferenceClient(HF_TOKEN)%3B%5Cn%5Cn%2F%2F%20Chat%20completion%20API%5Cnconst%20out%20%3D%20await%20client.chatCompletion(%7B%5Cn%20%20model%3A%20%5C%22meta-llama%2FLlama-3.1-8B-Instruct%5C%22%2C%5Cn%20%20messages%3A%20%5B%7B%20role%3A%20%5C%22user%5C%22%2C%20content%3A%20%5C%22Hello%2C%20nice%20to%20meet%20you!%5C%22%20%7D%5D%2C%5Cn%20%20max_tokens%3A%20512%5Cn%7D)%3B%5Cnconsole.log(out.choices%5B0%5D.message)%3B%5Cn%5Cn%2F%2F%20Streaming%20chat%20completion%20API%5Cnfor%20await%20(const%20chunk%20of%20client.chatCompletionStream(%7B%5Cn%20%20model%3A%20%5C%22meta-llama%2FLlama-3.1-8B-Instruct%5C%22%2C%5Cn%20%20messages%3A%20%5B%7B%20role%3A%20%5C%22user%5C%22%2C%20content%3A%20%5C%22Hello%2C%20nice%20to%20meet%20you!%5C%22%20%7D%5D%2C%5Cn%20%20max_tokens%3A%20512%5Cn%7D))%20%7B%5Cn%20%20console.log(chunk.choices%5B0%5D.delta.content)%3B%5Cn%7D%5Cn%5Cn%2F%2F%2F%20Using%20a%20third-party%20provider%3A%5Cnawait%20client.chatCompletion(%7B%5Cn%20%20model%3A%20%5C%22meta-llama%2FLlama-3.1-8B-Instruct%5C%22%2C%5Cn%20%20messages%3A%20%5B%7B%20role%3A%20%5C%22user%5C%22%2C%20content%3A%20%5C%22Hello%2C%20nice%20to%20meet%20you!%5C%22%20%7D%5D%2C%5Cn%20%20max_tokens%3A%20512%2C%5Cn%20%20provider%3A%20%5C%22sambanova%5C%22%2C%20%2F%2F%20or%20together%2C%20fal-ai%2C%20replicate%2C%20cohere%20%E2%80%A6%5Cn%7D)%5Cn%5Cnawait%20client.textToImage(%7B%5Cn%20%20model%3A%20%5C%22black-forest-labs%2FFLUX.1-dev%5C%22%2C%5Cn%20%20inputs%3A%20%5C%22a%20picture%20of%20a%20green%20bird%5C%22%2C%5Cn%20%20provider%3A%20%5C%22fal-ai%5C%22%2C%5Cn%7D)%5Cn%5Cn%5Cn%5Cn%2F%2F%20You%20can%20also%20omit%20%5C%22model%5C%22%20to%20use%20the%20recommended%20model%20for%20the%20task%5Cnawait%20client.translation(%7B%5Cn%20%20inputs%3A%20%5C%22My%20name%20is%20Wolfgang%20and%20I%20live%20in%20Amsterdam%5C%22%2C%5Cn%20%20parameters%3A%20%7B%5Cn%20%20%20%20src_lang%3A%20%5C%22en%5C%22%2C%5Cn%20%20%20%20tgt_lang%3A%20%5C%22fr%5C%22%2C
%5Cn%20%20%7D%2C%5Cn%7D)%3B%5Cn%5Cn%2F%2F%20pass%20multimodal%20files%20or%20URLs%20as%20inputs%5Cnawait%20client.imageToText(%7B%5Cn%20%20model%3A%20'nlpconnect%2Fvit-gpt2-image-captioning'%2C%5Cn%20%20data%3A%20await%20(await%20fetch('https%3A%2F%2Fpicsum.photos%2F300%2F300')).blob()%2C%5Cn%7D)%5Cn%5Cn%2F%2F%20Using%20your%20own%20dedicated%20inference%20endpoint%3A%20https%3A%2F%2Fhf.co%2Fdocs%2Finference-endpoints%2F%5Cnconst%20gpt2Client%20%3D%20client.endpoint('https%3A%2F%2Fxyz.eu-west-1.aws.endpoints.huggingface.cloud%2Fgpt2')%3B%5Cnconst%20%7B%20generated_text%20%7D%20%3D%20await%20gpt2Client.textGeneration(%7B%20inputs%3A%20'The%20answer%20to%20the%20universe%20is'%20%7D)%3B%5Cn%5Cn%2F%2F%20Chat%20Completion%5Cnconst%20llamaEndpoint%20%3D%20client.endpoint(%5Cn%20%20%5C%22https%3A%2F%2Frouter.huggingface.co%2Fhf-inference%2Fmodels%2Fmeta-llama%2FLlama-3.1-8B-Instruct%5C%22%5Cn)%3B%5Cnconst%20out%20%3D%20await%20llamaEndpoint.chatCompletion(%7B%5Cn%20%20model%3A%20%5C%22meta-llama%2FLlama-3.1-8B-Instruct%5C%22%2C%5Cn%20%20messages%3A%20%5B%7B%20role%3A%20%5C%22user%5C%22%2C%20content%3A%20%5C%22Hello%2C%20nice%20to%20meet%20you!%5C%22%20%7D%5D%2C%5Cn%20%20max_tokens%3A%20512%2C%5Cn%7D)%3B%5Cnconsole.log(out.choices%5B0%5D.message)%3B%5Cn%60%60%60%5Cn%5Cn%23%23%23%20%40huggingface%2Fhub%20examples%5Cn%5Cn%60%60%60ts%5Cnimport%20%7B%20createRepo%2C%20uploadFile%2C%20deleteFiles%20%7D%20from%20%5C%22%40huggingface%2Fhub%5C%22%3B%5Cn%5Cnconst%20HF_TOKEN%20%3D%20%5C%22hf_...%5C%22%3B%5Cn%5Cnawait%20createRepo(%7B%5Cn%20%20repo%3A%20%5C%22my-user%2Fnlp-model%5C%22%2C%20%2F%2F%20or%20%7B%20type%3A%20%5C%22model%5C%22%2C%20name%3A%20%5C%22my-user%2Fnlp-test%5C%22%20%7D%2C%5Cn%20%20accessToken%3A%20HF_TOKEN%5Cn%7D)%3B%5Cn%5Cnawait%20uploadFile(%7B%5Cn%20%20repo%3A%20%5C%22my-user%2Fnlp-model%5C%22%2C%5Cn%20%20accessToken%3A%20HF_TOKEN%2C%5Cn%20%20%2F%2F%20Can%20work%20with%20native%20File%20in%20browsers%5Cn%20%20file%3A%20%7B%5Cn%20%20%20%20path%3A%20%5C%22pytorch_
model.bin%5C%22%2C%5Cn%20%20%20%20content%3A%20new%20Blob(...)%5Cn%20%20%7D%5Cn%7D)%3B%5Cn%5Cnawait%20deleteFiles(%7B%5Cn%20%20repo%3A%20%7B%20type%3A%20%5C%22space%5C%22%2C%20name%3A%20%5C%22my-user%2Fmy-space%5C%22%20%7D%2C%20%2F%2F%20or%20%5C%22spaces%2Fmy-user%2Fmy-space%5C%22%5Cn%20%20accessToken%3A%20HF_TOKEN%2C%5Cn%20%20paths%3A%20%5B%5C%22README.md%5C%22%2C%20%5C%22.gitattributes%5C%22%5D%5Cn%7D)%3B%5Cn%60%60%60%5Cn%5Cn%23%23%23%20%40huggingface%2Fmcp-client%20example%5Cn%5Cn%60%60%60ts%5Cnimport%20%7B%20Agent%20%7D%20from%20'%40huggingface%2Fmcp-client'%3B%5Cn%5Cnconst%20HF_TOKEN%20%3D%20%5C%22hf_...%5C%22%3B%5Cn%5Cnconst%20agent%20%3D%20new%20Agent(%7B%5Cn%20%20provider%3A%20%5C%22auto%5C%22%2C%5Cn%20%20model%3A%20%5C%22Qwen%2FQwen2.5-72B-Instruct%5C%22%2C%5Cn%20%20apiKey%3A%20HF_TOKEN%2C%5Cn%20%20servers%3A%20%5B%5Cn%20%20%20%20%7B%5Cn%20%20%20%20%20%20%2F%2F%20Playwright%20MCP%5Cn%20%20%20%20%20%20command%3A%20%5C%22npx%5C%22%2C%5Cn%20%20%20%20%20%20args%3A%20%5B%5C%22%40playwright%2Fmcp%40latest%5C%22%5D%2C%5Cn%20%20%20%20%7D%2C%5Cn%20%20%5D%2C%5Cn%7D)%3B%5Cn%5Cnawait%20agent.loadTools()%3B%5Cn%5Cnfor%20await%20(const%20chunk%20of%20agent.run(%5C%22What%20are%20the%20top%205%20trending%20models%20on%20Hugging%20Face%3F%5C%22))%20%7B%5Cn%20%20%20%20if%20(%5C%22choices%5C%22%20in%20chunk)%20%7B%5Cn%20%20%20%20%20%20%20%20const%20delta%20%3D%20chunk.choices%5B0%5D%3F.delta%3B%5Cn%20%20%20%20%20%20%20%20if%20(delta.content)%20%7B%5Cn%20%20%20%20%20%20%20%20%20%20%20%20console.log(delta.content)%3B%5Cn%20%20%20%20%20%20%20%20%7D%5Cn%20%20%20%20%7D%5Cn%7D%5Cn%60%60%60%5Cn%5CnThere%20are%20more%20features%20of%20course%2C%20check%20each%20library's%20README!%5Cn%5Cn%23%23%20Formatting%20%26%20testing%5Cn%5Cn%60%60%60console%5Cnsudo%20corepack%20enable%5Cnpnpm%20install%5Cn%5Cnpnpm%20-r%20format%3Acheck%5Cnpnpm%20-r%20lint%3Acheck%5Cnpnpm%20-r%20test%5Cn%60%60%60%5Cn%5Cn%23%23%20Building%5Cn%5Cn%60%60%60%5Cnpnpm%20-r%20build%5Cn%60%60%60%5Cn%5CnThis%20will%20ge
nerate%20ESM%20and%20CJS%20javascript%20files%20in%20%60packages%2F*%2Fdist%60%2C%20eg%20%60packages%2Finference%2Fdist%2Findex.mjs%60.%5Cn%22%7D"},"zoneReducer":{"zoneDetail":"","newsDetail":""}}
# Hugging Face JS libraries
This is a collection of JS libraries to interact with the Hugging Face API, with TS types included.

- [@huggingface/inference](packages/inference/README.md): Use all supported (serverless) Inference Providers or switch to Inference Endpoints (dedicated) to make calls to 100,000+ Machine Learning models
- [@huggingface/hub](packages/hub/README.md): Interact with huggingface.co to create or delete repos and commit / download files
- [@huggingface/mcp-client](packages/mcp-client/README.md): A Model Context Protocol (MCP) client, and a tiny Agent library, built on top of InferenceClient.
- [@huggingface/gguf](packages/gguf/README.md): A GGUF parser that works on remotely hosted files.
- [@huggingface/dduf](packages/dduf/README.md): Similar package for DDUF (DDUF Diffusers Unified Format)
- [@huggingface/tasks](packages/tasks/README.md): The definition files and source-of-truth for the Hub's main primitives like pipeline tasks, model libraries, etc.
- [@huggingface/jinja](packages/jinja/README.md): A minimalistic JS implementation of the Jinja templating engine, to be used for ML chat templates.
- [@huggingface/space-header](packages/space-header/README.md): Use the Space `mini_header` outside Hugging Face
- [@huggingface/ollama-utils](packages/ollama-utils/README.md): Various utilities for maintaining Ollama compatibility with models on the Hugging Face Hub.
- [@huggingface/tiny-agents](packages/tiny-agents/README.md): A tiny, model-agnostic library for building AI agents that can use tools.

We use modern features to avoid polyfills and dependencies, so the libraries will only work on modern browsers / Node.js >= 18 / Bun / Deno.
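If you want to fail fast when the runtime is too old, the Node.js >= 18 requirement can be checked up front. A minimal sketch (this helper is our own illustration, not part of any @huggingface package):

```typescript
// Illustrative helper (not part of any @huggingface package): checks whether
// a Node.js version string such as "v18.17.0" meets the >= 18 requirement.
function meetsNodeRequirement(version: string, minMajor = 18): boolean {
  // parseInt stops at the first non-digit, so "18.17.0" parses as 18.
  const major = Number.parseInt(version.replace(/^v/, ""), 10);
  return Number.isInteger(major) && major >= minMajor;
}

// In Node.js, the running version is available as process.version:
// if (!meetsNodeRequirement(process.version)) throw new Error("Node.js >= 18 required");
```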
The libraries are still very young; please help us by opening issues!
## Installation
### From NPM
To install via NPM, you can download the libraries as needed:

```bash
npm install @huggingface/inference
npm install @huggingface/hub
npm install @huggingface/mcp-client
```
Then import the libraries in your code:

```ts
import { InferenceClient } from "@huggingface/inference";
import { createRepo, commit, deleteRepo, listFiles } from "@huggingface/hub";
import { McpClient } from "@huggingface/mcp-client";
import type { RepoId } from "@huggingface/hub";
```
### From CDN or Static hosting
You can run our packages with vanilla JS, without any bundler, by using a CDN or static hosting. Using [ES modules](https://hacks.mozilla.org/2018/03/es-modules-a-cartoon-deep-dive/), i.e. `<script type="module">`, you can import the libraries in your code:

```html
<script type="module">
    import { InferenceClient } from 'https://cdn.jsdelivr.net/npm/@huggingface/inference@4.3.2/+esm';
    import { createRepo, commit, deleteRepo, listFiles } from "https://cdn.jsdelivr.net/npm/@huggingface/hub@2.3.0/+esm";
</script>
```

### Deno

```ts
// esm.sh
import { InferenceClient } from "https://esm.sh/@huggingface/inference"

import { createRepo, commit, deleteRepo, listFiles } from "https://esm.sh/@huggingface/hub"
// or npm:
import { InferenceClient } from "npm:@huggingface/inference"

import { createRepo, commit, deleteRepo, listFiles } from "npm:@huggingface/hub"
```
## Usage examples
Get your HF access token in your [account settings](https://huggingface.co/settings/tokens).
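Rather than hard-coding the token, a common pattern is to read it from an environment variable. A small sketch (the `HF_TOKEN` variable name and `getToken` helper are our own, not part of the libraries):

```typescript
// Illustrative helper (not provided by the libraries): read the access token
// from an environment map and fail fast with a pointer to the settings page.
function getToken(env: Record<string, string | undefined>): string {
  const token = env.HF_TOKEN; // assumed variable name; any name works
  if (!token) {
    throw new Error(
      "Missing HF_TOKEN - create one at https://huggingface.co/settings/tokens"
    );
  }
  return token;
}

// In Node.js you would call getToken(process.env) and pass the result to
// new InferenceClient(...) or the accessToken option of the hub functions.
```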
### @huggingface/inference examples

```ts
import { InferenceClient } from "@huggingface/inference";

const HF_TOKEN = "hf_...";

const client = new InferenceClient(HF_TOKEN);

// Chat completion API
const out = await client.chatCompletion({
  model: "meta-llama/Llama-3.1-8B-Instruct",
  messages: [{ role: "user", content: "Hello, nice to meet you!" }],
  max_tokens: 512
});
console.log(out.choices[0].message);

// Streaming chat completion API
for await (const chunk of client.chatCompletionStream({
  model: "meta-llama/Llama-3.1-8B-Instruct",
  messages: [{ role: "user", content: "Hello, nice to meet you!" }],
  max_tokens: 512
})) {
  console.log(chunk.choices[0].delta.content);
}

// Using a third-party provider:
await client.chatCompletion({
  model: "meta-llama/Llama-3.1-8B-Instruct",
  messages: [{ role: "user", content: "Hello, nice to meet you!" }],
  max_tokens: 512,
  provider: "sambanova", // or together, fal-ai, replicate, cohere …
})

await client.textToImage({
  model: "black-forest-labs/FLUX.1-dev",
  inputs: "a picture of a green bird",
  provider: "fal-ai",
})

// You can also omit "model" to use the recommended model for the task
await client.translation({
  inputs: "My name is Wolfgang and I live in Amsterdam",
  parameters: {
    src_lang: "en",
    tgt_lang: "fr",
  },
});

// pass multimodal files or URLs as inputs
await client.imageToText({
  model: 'nlpconnect/vit-gpt2-image-captioning',
  data: await (await fetch('https://picsum.photos/300/300')).blob(),
})

// Using your own dedicated inference endpoint: https://hf.co/docs/inference-endpoints/
const gpt2Client = client.endpoint('https://xyz.eu-west-1.aws.endpoints.huggingface.cloud/gpt2');
const { generated_text } = await gpt2Client.textGeneration({ inputs: 'The answer to the universe is' });

// Chat Completion
const llamaEndpoint = client.endpoint(
  "https://router.huggingface.co/hf-inference/models/meta-llama/Llama-3.1-8B-Instruct"
);
const chatOut = await llamaEndpoint.chatCompletion({
  model: "meta-llama/Llama-3.1-8B-Instruct",
  messages: [{ role: "user", content: "Hello, nice to meet you!" }],
  max_tokens: 512,
});
console.log(chatOut.choices[0].message);
```
### @huggingface/hub examples

```ts
import { createRepo, uploadFile, deleteFiles } from "@huggingface/hub";

const HF_TOKEN = "hf_...";

await createRepo({
  repo: "my-user/nlp-model", // or { type: "model", name: "my-user/nlp-test" },
  accessToken: HF_TOKEN
});

await uploadFile({
  repo: "my-user/nlp-model",
  accessToken: HF_TOKEN,
  // Can work with native File in browsers
  file: {
    path: "pytorch_model.bin",
    content: new Blob(...)
  }
});

await deleteFiles({
  repo: { type: "space", name: "my-user/my-space" }, // or "spaces/my-user/my-space"
  accessToken: HF_TOKEN,
  paths: ["README.md", ".gitattributes"]
});
```
### @huggingface/mcp-client example

```ts
import { Agent } from '@huggingface/mcp-client';

const HF_TOKEN = "hf_...";

const agent = new Agent({
  provider: "auto",
  model: "Qwen/Qwen2.5-72B-Instruct",
  apiKey: HF_TOKEN,
  servers: [
    {
      // Playwright MCP
      command: "npx",
      args: ["@playwright/mcp@latest"],
    },
  ],
});

await agent.loadTools();

for await (const chunk of agent.run("What are the top 5 trending models on Hugging Face?")) {
    if ("choices" in chunk) {
        const delta = chunk.choices[0]?.delta;
        if (delta.content) {
            console.log(delta.content);
        }
    }
}
```
There are more features, of course; check each library's README!
## Formatting & testing

```console
sudo corepack enable
pnpm install

pnpm -r format:check
pnpm -r lint:check
pnpm -r test
```
## Building

```
pnpm -r build
```
This will generate ESM and CJS JavaScript files in `packages/*/dist`, e.g. `packages/inference/dist/index.mjs`.