IBM wxflows
by IBM
Tool platform by IBM to build, test and deploy tools for any data source
What is IBM wxflows
Using watsonx.ai Flows Engine with Model Context Protocol (MCP)
Here's a step-by-step tutorial for setting up and deploying a project with wxflows, including installing the necessary tools, deploying the app, and running it locally.
This example consists of the following pieces:
- MCP TypeScript SDK (MCP server)
- wxflows SDK (tools)
You can use any of the supported MCP clients.
This guide will walk you through installing the wxflows CLI, initializing and deploying a project, and running the application locally. We'll use the google_books and wikipedia tools as examples for tool calling with wxflows.
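Putting these pieces together, the example directory is laid out roughly as follows. This is a sketch: the src/ folder name is an assumption, while the wxflows/ project folder, the .env.sample file, and the build/index.js entry point all come from the steps below.
examples/mcp/javascript/
├── wxflows/         # Flows Engine project (endpoint and tool imports), deployed with wxflows deploy
├── src/             # MCP server source (assumed name), compiled by npm run build
├── build/index.js   # built server entry point referenced in the Claude Desktop config
├── .env.sample      # template for the credentials added in Step 3
└── package.json     # scripts and dependencies, including @wxflows/sdk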
Before you start
Clone this repository and change into the example directory:
git clone https://github.com/IBM/wxflows.git
cd wxflows/examples/mcp/javascript
Step 1: Set up wxflows
Before you can start building AI applications using watsonx.ai Flows Engine, complete these steps (a command sketch follows the list):
- Sign up for a free account
- Download & install the Node.js CLI
- Authenticate your account
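On most setups the last two steps come down to commands along the following lines; the package name and login subcommand are assumptions here, so follow the sign-up page if its instructions differ.
npm install -g wxflows   # install the Node.js CLI (assumed npm package name)
wxflows login            # authenticate the CLI with your account (assumed subcommand)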
Step 2: Deploy a Flows Engine project
Move into the wxflows directory:
cd wxflows
There's already a wxflows project set up in this repository with the following configuration:
- Defines an endpoint api/mcp-example for the project.
- Imports the google_books tool, with a description for searching books and the fields books|book.
- Imports the wikipedia tool, with a description for Wikipedia searches and the fields search|page.
You can deploy this tool configuration to a Flows Engine endpoint by running:
wxflows deploy
This command deploys the endpoint and the tools defined for it; both will be used by the wxflows SDK in your application.
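If you want to sanity-check the deployment before wiring up the MCP server, you can call the endpoint directly. The sketch below assumes the Flows Engine endpoint is a GraphQL API authenticated with an apikey Authorization header; the verify-endpoint.mjs file name and the minimal { __typename } query are illustrative only.
// verify-endpoint.mjs -- hypothetical helper, not part of this repository
// Requires Node.js 18+ for the built-in fetch API.
const endpoint = process.env.WXFLOWS_ENDPOINT;
const apikey = process.env.WXFLOWS_APIKEY;

const res = await fetch(endpoint, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `apikey ${apikey}`,
  },
  // A trivial GraphQL query: any valid response confirms the endpoint is reachable.
  body: JSON.stringify({ query: "{ __typename }" }),
});
console.log(res.status, await res.json());
Run it with the environment variables from Step 3, for example node --env-file=.env verify-endpoint.mjs on Node.js 20.6 or later.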
Step 3: Set up environment variables
From the project's root directory, copy the sample environment file to create your .env file:
cp .env.sample .env
Edit the .env file and add your credentials, such as API keys and other required environment variables. Ensure the credentials are correct so the tools can authenticate and interact with external services.
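The variable names below are the ones the Claude Desktop configuration in Step 6 expects; the placeholder values are illustrative, and your .env.sample may list additional variables.
# .env -- replace the placeholders with your own values
WXFLOWS_APIKEY=your-wxflows-api-key
WXFLOWS_ENDPOINT=your-deployed-endpoint-url   # the api/mcp-example endpoint from Step 2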
Step 4: Install dependencies in the application
To run the application, install the necessary dependencies:
npm i
This command installs all required packages, including the @wxflows/sdk package and any dependencies specified in the project.
Step 5: Build the MCP server
Build the server by running:
npm run build
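To give a sense of what the build produces, here is a minimal sketch of an MCP server that exposes the two tools over stdio using the MCP TypeScript SDK. It is not the actual source of this example: the real server presumably derives its tool definitions from your Flows Engine deployment via @wxflows/sdk, whereas this sketch hard-codes illustrative input schemas and forwards calls through a hypothetical callWxflowsTool helper.
// index.ts -- illustrative sketch only, not this repository's implementation
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import {
  ListToolsRequestSchema,
  CallToolRequestSchema,
} from "@modelcontextprotocol/sdk/types.js";

const server = new Server(
  { name: "wxflows-server", version: "0.1.0" },
  { capabilities: { tools: {} } }
);

// Advertise the tools to the MCP client; the input schemas are illustrative.
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "google_books",
      description: "Search for books",
      inputSchema: {
        type: "object",
        properties: { query: { type: "string" } },
        required: ["query"],
      },
    },
    {
      name: "wikipedia",
      description: "Search Wikipedia",
      inputSchema: {
        type: "object",
        properties: { query: { type: "string" } },
        required: ["query"],
      },
    },
  ],
}));

// Hypothetical helper: forwards a tool call to the deployed Flows Engine endpoint.
async function callWxflowsTool(name: string, args: unknown): Promise<unknown> {
  // Placeholder request: the real example would execute the selected tool
  // (google_books or wikipedia) with `args`, e.g. via the @wxflows/sdk package.
  const res = await fetch(process.env.WXFLOWS_ENDPOINT!, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `apikey ${process.env.WXFLOWS_APIKEY}`,
    },
    body: JSON.stringify({ query: "{ __typename }" }),
  });
  return { tool: name, args, response: await res.json() };
}

// Route tool calls from the MCP client to the Flows Engine endpoint.
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { name, arguments: args } = request.params;
  const result = await callWxflowsTool(name, args);
  return { content: [{ type: "text", text: JSON.stringify(result) }] };
});

async function main() {
  const transport = new StdioServerTransport();
  await server.connect(transport);
}
main().catch(console.error);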
Step 6: Use in an MCP client
Finally, you can use the MCP server in a client. To use with Claude Desktop, add the server config:
On macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
On Windows: %APPDATA%/Claude/claude_desktop_config.json
{
"mcpServers": {
"wxflows-server": {
"command": "node",
"args": ["/path/to/wxflows-server/build/index.js"],
"env": {
"WXFLOWS_APIKEY": "YOUR_WXFLOWS_APIKEY",
"WXFLOWS_ENDPOINT": "YOUR_WXFLOWS_ENDPOINT"
}
}
}
}
You can now open Claude Desktop, and the tools from the wxflows-server should be listed. You can then test the google_books and wikipedia tools directly through Claude Desktop.
Summary
You've now successfully set up, deployed, and run a wxflows project with the google_books and wikipedia tools. This setup provides a flexible environment to leverage external tools for data retrieval, allowing you to further build and expand your app with wxflows. See the instructions in tools to add more tools or create your own tools from databases, NoSQL, REST, or GraphQL APIs.
Support
Please reach out to us on Discord if you have any questions or want to share feedback. We'd love to hear from you!
Debugging
Since MCP servers communicate over stdio, debugging can be challenging. We recommend using the MCP Inspector, which is available as a package script:
npm run inspector
The Inspector will provide a URL to access debugging tools in your browser.
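If the inspector script isn't available in your copy of the project, the Inspector can typically be launched directly against the built server; the invocation below follows the Inspector's standard pattern, with the build path taken from the Claude Desktop configuration above.
npx @modelcontextprotocol/inspector node build/index.js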
Frequently Asked Questions
What is MCP?
MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.
What are MCP Servers?
MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.
How do MCP Servers work?
MCP Servers follow a client-server architecture where a host application (like Claude Desktop) connects to multiple servers. Each server provides specific functionality through standardized endpoints and protocols, enabling Claude to access data and perform actions through the standardized protocol.
Are MCP Servers secure?
Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.
Related MCP Servers
MasterGo MCP Server
MasterGo Magic MCP is a standalone MCP (Model Context Protocol) service for connecting the MasterGo design tool with AI models. It allows AI models to fetch DSL data directly from MasterGo design files.
Filesystem MCP Server
A core MCP server that provides filesystem access capabilities for Claude. Enables secure reading, writing, and management of files on your local system with granular permission controls.
Brave Search MCP
Integrate Brave Search capabilities into Claude through MCP. Enables real-time web searches with privacy-focused results and comprehensive web coverage.
Magic MCP
Create crafted UI components inspired by the best 21st.dev design engineers.
Apify
Use 3,000+ pre-built cloud tools to extract data from websites, e-commerce, social media, search engines, maps, and more
mcp server axiom
A Model Context Protocol server implementation for Axiom that enables AI agents to query your data using Axiom Processing Language (APL). Query and analyze your Axiom logs, traces, and all other event data in natural language.
Browserbase
The Model Context Protocol (MCP) is an open protocol that enables seamless integration between LLM applications and external data sources and tools. Whether you're building an AI-powered IDE, enhancing a chat interface, or creating custom AI workflows, MCP provides a standardized way to connect LLMs with the context they need. This server provides cloud browser automation capabilities using Browserbase, Puppeteer, and Stagehand (coming soon). It enables LLMs to interact with web pages, take screenshots, and execute JavaScript in a cloud browser environment, and to automate browser interactions in the cloud (e.g. web navigation, data extraction, form filling, and more).
ClickHouse
Query your ClickHouse database server.
Cloudflare
Deploy, configure & interrogate your resources on the Cloudflare developer platform (e.g. Workers/KV/R2/D1)
E2B
Run code in secure sandboxes hosted by E2B