NvkAnirudh LinkedIn Post Generator

by NvkAnirudh

A Model Context Protocol (MCP) server that automates generating LinkedIn post drafts from YouTube videos. This server provides high-quality, editable content drafts based on YouTube video transcripts.

What is NvkAnirudh LinkedIn Post Generator

LinkedIn Post Generator

A Model Context Protocol (MCP) server that automates generating professional LinkedIn post drafts from YouTube videos. This tool streamlines content repurposing by extracting transcripts from YouTube videos, summarizing the content, and generating engaging LinkedIn posts tailored to your preferences.

Table of Contents

  • Features
  • Installation
  • Using with Claude Desktop
  • Configuration
  • Usage
  • Deployment
  • Contributing
  • License

Features

  • YouTube Transcript Extraction: Automatically extract transcripts from any YouTube video
  • Content Summarization: Generate concise summaries with customizable tone and target audience
  • LinkedIn Post Generation: Create professional LinkedIn posts with customizable style and tone
  • All-in-One Workflow: Go from YouTube URL to LinkedIn post in a single operation
  • Customization Options: Adjust tone, audience, word count, and more to match your personal brand
  • MCP Integration: Works seamlessly with AI assistants that support the Model Context Protocol

Installation

Local Development

  1. Clone the repository:

    git clone https://github.com/NvkAnirudh/LinkedIn-Post-Generator.git
    cd LinkedIn-Post-Generator
    
  2. Install dependencies:

    npm install
    
  3. Create a .env file based on the example:

    cp .env.example .env
    
  4. Add your API keys to the .env file:

    OPENAI_API_KEY=your_openai_api_key
    YOUTUBE_API_KEY=your_youtube_api_key
    
  5. Run the server:

    npm run dev
    
  6. Test with MCP Inspector:

    npm run inspect
    

Using with Claude Desktop

This MCP server is designed to work with Claude Desktop and other AI assistants that support the Model Context Protocol. To use it with Claude Desktop:

  1. Configure Claude Desktop by editing the configuration file at ~/Library/Application Support/Claude/claude_desktop_config.json (Mac) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

    {
      "mcpServers": {
        "linkedin-post-generator": {
          "command": "npx",
          "args": [
            "-y",
            "@smithery/cli@latest",
            "run",
            "@NvkAnirudh/linkedin-post-generator",
            "--key",
            "YOUR_SMITHERY_API_KEY",
            "--config",
            "{\"OPENAI_API_KEY\":\"YOUR_OPENAI_API_KEY\",\"YOUTUBE_API_KEY\":\"YOUR_YOUTUBE_API_KEY\"}",
            "--transport",
            "stdio"
          ]
        }
      }
    }
    

    Replace:

    • YOUR_SMITHERY_API_KEY with your Smithery API key
    • YOUR_OPENAI_API_KEY with your OpenAI API key
    • YOUR_YOUTUBE_API_KEY with your YouTube API key (optional)
  2. Restart Claude Desktop

  3. In Claude Desktop, you can now access the LinkedIn Post Generator tools without needing to set API keys again

Configuration

The application requires API keys to function properly:

  1. OpenAI API Key (required): Used for content summarization and post generation
  2. YouTube API Key (optional): Enhances YouTube metadata retrieval

You can provide these keys in three ways:

1. Via Claude Desktop Configuration (Recommended)

When using with Claude Desktop and Smithery, the best approach is to include your API keys in the Claude Desktop configuration file as shown in the Using with Claude Desktop section. This way, the keys are automatically passed to the MCP server, and you don't need to set them again.

2. As Environment Variables

When running locally, you can set API keys as environment variables in a .env file:

OPENAI_API_KEY=your_openai_api_key
YOUTUBE_API_KEY=your_youtube_api_key

3. Using the Set API Keys Tool

If you haven't provided API keys through the configuration or environment variables, you can set them directly through the MCP interface using the set_api_keys tool.

Usage

Available Tools

Set API Keys

  • Tool: set_api_keys
  • Purpose: Configure your API keys
  • Parameters:
    • openaiApiKey: Your OpenAI API key (required)
    • youtubeApiKey: Your YouTube API key (optional)
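
An illustrative arguments payload for this tool (both values are placeholders, not real credentials):

    {
      "openaiApiKey": "sk-...",
      "youtubeApiKey": "AIza..."
    }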

Check API Keys

  • Tool: check_api_keys
  • Purpose: Verify your API key configuration status

Extract Transcript

  • Tool: extract_transcript
  • Purpose: Get the transcript from a YouTube video
  • Parameters:
    • youtubeUrl: URL of the YouTube video
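
For example, a call to this tool passes a single argument (VIDEO_ID is a placeholder):

    {
      "youtubeUrl": "https://www.youtube.com/watch?v=VIDEO_ID"
    }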

Summarize Transcript

  • Tool: summarize_transcript
  • Purpose: Create a concise summary of the video content
  • Parameters:
    • transcript: The video transcript text
    • tone: Educational, inspirational, professional, or conversational
    • audience: General, technical, business, or academic
    • wordCount: Approximate word count for the summary (100-300)
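
A minimal illustrative payload; the values are placeholders, and the exact casing accepted for tone and audience may differ from the descriptions above, so check the tool schema in the MCP Inspector:

    {
      "transcript": "(full transcript text from extract_transcript)",
      "tone": "educational",
      "audience": "technical",
      "wordCount": 150
    }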

Generate LinkedIn Post

  • Tool: generate_linkedin_post
  • Purpose: Create a LinkedIn post from a summary
  • Parameters:
    • summary: Summary of the video content
    • videoTitle: Title of the YouTube video
    • speakerName: Name of the speaker (optional)
    • hashtags: Relevant hashtags (optional)
    • tone: First-person, third-person, or thought-leader
    • includeCallToAction: Whether to include a call to action
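
An illustrative payload, assuming hashtags is passed as an array of strings (the title, speaker, and hashtag values are placeholders; verify the exact field shapes against the tool schema):

    {
      "summary": "(summary text from summarize_transcript)",
      "videoTitle": "Building MCP Servers with TypeScript",
      "speakerName": "Jane Doe",
      "hashtags": ["MCP", "AI", "ContentRepurposing"],
      "tone": "first-person",
      "includeCallToAction": true
    }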

All-in-One: YouTube to LinkedIn Post

  • Tool: youtube_to_linkedin_post
  • Purpose: Complete workflow from YouTube URL to LinkedIn post
  • Parameters:
    • youtubeUrl: YouTube video URL
    • tone: Desired tone for the post
    • Plus additional customization options
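
A minimal illustrative payload using only the two parameters listed above (VIDEO_ID is a placeholder):

    {
      "youtubeUrl": "https://www.youtube.com/watch?v=VIDEO_ID",
      "tone": "professional"
    }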

Workflow Example

  1. Set your API keys using the set_api_keys tool
  2. Use the youtube_to_linkedin_post tool with a YouTube URL
  3. Receive a complete LinkedIn post draft ready to publish
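
Under the hood, an MCP client (such as Claude Desktop or the MCP Inspector) performs step 2 by sending a standard tools/call request to the server. A sketch of that request, with placeholder values:

    {
      "jsonrpc": "2.0",
      "id": 1,
      "method": "tools/call",
      "params": {
        "name": "youtube_to_linkedin_post",
        "arguments": {
          "youtubeUrl": "https://www.youtube.com/watch?v=VIDEO_ID",
          "tone": "professional"
        }
      }
    }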

Deployment

This server is deployed on Smithery, a platform for hosting and sharing MCP servers. The deployment configuration is defined in the smithery.yaml file.

To deploy your own instance:

  1. Create an account on Smithery
  2. Install the Smithery CLI:

    npm install -g @smithery/cli

  3. Deploy the server:

    smithery deploy
    

Contributing

Contributions are welcome and appreciated! Here's how you can contribute to the LinkedIn Post Generator:

Reporting Issues

  • Use the GitHub issue tracker to report bugs or suggest features
  • Please provide detailed information about the issue, including steps to reproduce, expected behavior, and actual behavior
  • Include your environment details (OS, Node.js version, etc.) when reporting bugs

Pull Requests

  1. Fork the repository
  2. Create a new branch (git checkout -b feature/your-feature-name)
  3. Make your changes
  4. Run tests to ensure your changes don't break existing functionality
  5. Commit your changes (git commit -m 'Add some feature')
  6. Push to the branch (git push origin feature/your-feature-name)
  7. Open a Pull Request

Development Guidelines

  • Follow the existing code style and conventions
  • Write clear, commented code
  • Include tests for new features
  • Update documentation to reflect your changes

Feature Suggestions

If you have ideas for new features or improvements:

  1. Check existing issues to see if your suggestion has already been proposed
  2. If not, open a new issue with the label 'enhancement'
  3. Clearly describe the feature and its potential benefits

Documentation

Improvements to documentation are always welcome:

  • Fix typos or clarify existing documentation
  • Add examples or use cases
  • Improve the structure or organization of the documentation

By contributing to this project, you agree that your contributions will be licensed under the project's MIT License.

License

MIT

Frequently Asked Questions

What is MCP?

MCP (Model Context Protocol) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications, providing a standardized way to connect AI models to different data sources and tools.

What are MCP Servers?

MCP Servers are lightweight programs that expose specific capabilities through the standardized Model Context Protocol. They act as bridges between LLMs like Claude and various data sources or services, allowing secure access to files, databases, APIs, and other resources.

How do MCP Servers work?

MCP Servers follow a client-server architecture in which a host application (like Claude Desktop) connects to multiple servers. Each server exposes specific functionality through standardized endpoints, allowing Claude to access data and perform actions through a single, consistent protocol.

Are MCP Servers secure?

Yes, MCP Servers are designed with security in mind. They run locally with explicit configuration and permissions, require user approval for actions, and include built-in security features to prevent unauthorized access and ensure data privacy.

Related MCP Servers

chrisdoc hevy mcp

mcp

sylphlab pdf reader mcp

An MCP server built with Node.js/TypeScript that allows AI agents to securely read PDF files (local or URL) and extract text, metadata, or page counts. Uses pdf-parse.

pdf-parse, typescript, nodejs

aashari mcp server atlassian bitbucket

Node.js/TypeScript MCP server for Atlassian Bitbucket. Enables AI systems (LLMs) to interact with workspaces, repositories, and pull requests via tools (list, get, comment, search). Connects AI directly to version control workflows through the standard MCP interface.

atlassian, repository, mcp

aashari mcp server atlassian confluence

Node.js/TypeScript MCP server for Atlassian Confluence. Provides tools enabling AI systems (LLMs) to list/get spaces & pages (content formatted as Markdown) and search via CQL. Connects AI seamlessly to Confluence knowledge bases using the standard MCP interface.

atlassian, mcp, confluence

prisma prisma

Next-generation ORM for Node.js & TypeScript | PostgreSQL, MySQL, MariaDB, SQL Server, SQLite, MongoDB and CockroachDB

cockroachdb, go, mcp

Zzzccs123 mcp sentry

mcp sentry for typescript sdk

mcp, typescript

zhuzhoulin dify mcp server

mcp

zhongmingyuan mcp my mac

mcp

zhixiaoqiang desktop image manager mcp

An MCP server for managing desktop images: view details, compress, move, and more (implemented entirely with Trae)

mcp

zhixiaoqiang antd components mcp

An MCP service for querying Ant Design components | An MCP service that reduces hallucinations when generating Ant Design component code, including system prompts, component documentation, API documentation, code examples, and changelog queries

design, antd, api
