Langflow

Software Development

Uberlândia, Minas Gerais · 14,630 followers

Langflow is a low-code app builder for RAG and multi-agent AI applications. It’s Python-based and agnostic to any model.

About us

Langflow is a new, visual way to build, iterate and deploy AI apps.

Website
https://www.langflow.org/
Industry
Software Development
Company size
11-50 employees
Headquarters
Uberlândia, Minas Gerais
Type
Self-Owned
Founded
2020
Specialties
AI, Generative AI, GenAI, RAG, and Machine Learning

Locations

Employees at Langflow

Updates

  • Turn your AI agent into a Notion assistant

    With Composio components in Langflow, you can connect Notion directly into your flow and turn scattered inputs into structured, reliable knowledge. Instead of manually managing pages and databases, your agent can:
    - Create structured pages for meetings, projects, and documentation
    - Update existing content as new information comes in
    - Organize data into databases with consistent formatting
    - Generate summaries, insights, and action items automatically
    - Keep your workspace clean, searchable, and always up to date

    All visually. All with drag and drop.

    One powerful example is automated knowledge structuring: your agent can transform raw inputs into well-structured Notion pages, organizing information into clear sections, extracting key points, and generating action items or documentation automatically. Over time, this becomes a living knowledge base that stays consistent and continuously updated, without manual effort.

    👉 Learn more about Langflow: https://lnkd.in/dfcF-i5A

  • Langflow Use Case: Marketing Content Generator

    Turn a single brief into high-quality, research-backed marketing content across channels. This Langflow workflow uses an AI agent to gather real-time insights, understand your audience, and generate tailored content for social media, blogs, emails, and more. Combine structured briefings with live research to create more relevant, consistent, and scalable marketing outputs. Ready for real production use.

    🔗 Template: https://lnkd.in/d5YseSqk

  • Chat Refactor (Playground Improvements) - Langflow 1.8

    Faster, more reliable chat for real-world workflows. Working with long conversations, streaming responses, and rich metadata shouldn’t slow down the interface or break the experience.

    With the launch of Langflow 1.8, the chat and playground experience has been refactored with a new messaging architecture designed for performance and reliability:
    - Improved session and message lifecycle management
    - Better handling of long histories and complex message data
    - Reduced UI lag during streaming and extended conversations

    Example: Running long, multi-turn conversations with continuous streaming now feels smooth and responsive, even as message history and metadata grow. You can also keep building your flow while testing it in parallel.

    Why it matters:
    - More stable interactions during development and testing
    - Faster iteration with real-time feedback inside the builder
    - Better performance for long and complex chat workflows

    Upgrade to Langflow 1.8 and experience a faster, more responsive chat and playground. https://lnkd.in/dxrCyaM5

  • 🔹 Agentics - Langflow 1.8

    Building workflows that coordinate tools, transform structured data, and run multi-step operations shouldn’t require complex external frameworks or custom orchestration layers.

    With the launch of Langflow 1.8, IBM’s open-source Agentics framework is integrated directly into the platform, introducing a structured and typed approach to AI workflows:
    - Tool-driven execution across multiple steps
    - Typed data transformations and generation across structured workflows
    - LLM-powered transformations (transductions) applied directly to structured data
    - Parallel execution through async batching and map/reduce-style operations

    How the core operations work:
    - aMap (N → N): transforms each row independently using an LLM. Ideal for tasks like classification, enrichment, or extracting structured fields from unstructured text.
    - aReduce (N → 1): aggregates multiple rows into a single structured output. Useful for summarization, grouping insights, or generating reports from datasets.
    - aGenerate (0 → N): creates new rows based on a defined schema. Enables synthetic data generation, structured outputs, or dataset expansion.

    Example: A workflow can process thousands of product reviews, classify sentiment per row (aMap), aggregate key insights (aReduce), and generate structured reports (aGenerate), all inside a single pipeline.

    Why it matters:
    - More predictable and controlled data transformations
    - Built-in validation and structured outputs (no fragile parsing)
    - Higher throughput with async batching and parallel execution
    - Better traceability and reliability for AI-driven data workflows

    ⭐ If you liked it, consider starring the Agentics repository: https://lnkd.in/dJZupv4g
    👉 Learn more about Agentics: https://lnkd.in/dezQ9aSH
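As a rough illustration of the aMap/aReduce pattern in plain Python (this is not the Agentics API itself; the `classify_sentiment` stub stands in for a real LLM call):

```python
import asyncio

# Stand-in for an LLM call; a real workflow would call a model API here.
async def classify_sentiment(review: str) -> str:
    await asyncio.sleep(0)  # placeholder for model/network latency
    return "positive" if "good" in review.lower() else "negative"

async def a_map(rows: list[str]) -> list[str]:
    # N -> N: transform each row independently, batched with asyncio
    return list(await asyncio.gather(*(classify_sentiment(r) for r in rows)))

def a_reduce(labels: list[str]) -> dict[str, int]:
    # N -> 1: aggregate many rows into one structured output
    return {label: labels.count(label) for label in sorted(set(labels))}

reviews = [
    "Good battery life",
    "Screen cracked on day one",
    "Really good value for the price",
]
labels = asyncio.run(a_map(reviews))
summary = a_reduce(labels)
print(summary)  # {'negative': 1, 'positive': 2}
```

The per-row calls run concurrently via `asyncio.gather`, which is the same reason async batching raises throughput in the real framework.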

  • Knowledge Bases - Langflow 1.8

    Store and query knowledge directly inside Langflow. Building AI workflows that rely on documents, datasets, or internal knowledge shouldn’t require complex custom retrieval pipelines.

    With the launch of Langflow 1.8, Knowledge Bases introduce local vector databases inside Langflow, making it easier to store, retrieve, and reuse information across workflows:
    - Documents and datasets can be indexed and stored as vector data
    - Workflows can retrieve relevant context directly from the knowledge base
    - Multiple flows can access the same knowledge source

    Example: A workflow can query a knowledge base of documents, retrieve relevant context, and use that information to generate more accurate responses.

    Why it matters:
    - Simplifies building retrieval-augmented workflows
    - Makes it easier to work with documents and datasets
    - Keeps vector data accessible directly within Langflow

    👉 Upgrade to Langflow 1.8 and start building workflows powered by your own knowledge. https://lnkd.in/dxKmNdRt
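A minimal sketch of the idea behind a local vector store, using toy 3-dimensional embeddings and cosine similarity (illustrative only; the document names, vectors, and `retrieve` helper are made up, and a real knowledge base uses an embedding model and a proper vector database):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "knowledge base": document name -> embedding vector
store = {
    "refund policy": [1.0, 0.1, 0.0],
    "shipping times": [0.0, 1.0, 0.2],
    "warranty terms": [0.9, 0.0, 0.4],
}

def retrieve(query_vec, k=1):
    # Rank stored documents by similarity to the query vector
    ranked = sorted(store.items(), key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [doc for doc, _ in ranked[:k]]

print(retrieve([1.0, 0.0, 0.1]))  # ['refund policy']
```

The point of the release feature is that this index lives inside Langflow itself, so several flows can query the same store without each one wiring up its own retrieval pipeline.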

  • 🔹 Inspection Panel - Langflow 1.8

    Working with visual workflows shouldn’t require switching between the builder and external logs just to inspect component behavior.

    With the launch of Langflow 1.8, the new Inspection Panel makes it easier to inspect how individual components behave during execution:
    - Direct access to component inputs, outputs, parameters, and internal state
    - Inspection during or after execution without leaving the flow builder
    - Clearer visibility into how data moves between nodes

    Example: When a flow produces an unexpected result, you can click a component and inspect the data it received and produced directly in the workspace.

    Why it matters:
    - Better visibility into component behavior
    - Faster understanding of data flow between nodes
    - Less reliance on external logs or print statements

    👉 Upgrade to Langflow 1.8 and inspect component behavior directly in the workspace. https://lnkd.in/drKznDkU

  • 🔹 API Redesign (Phase 1) - Langflow 1.8

    Integrating Langflow into applications shouldn’t require working around inconsistent endpoints or hard-to-predict request formats.

    With the launch of Langflow 1.8, workflow execution moves toward a more standardized and predictable API structure:
    - Introduction of V2 workflow endpoints (beta)
    - More consistent REST-style request and response schemas
    - Cleaner foundation for running workflows via API in real applications

    Example: Instead of embedding the flow ID directly in the endpoint path, workflows are now executed through the /api/v2/workflows endpoint using a structured request body, making integrations easier to read, maintain, and scale.

    Why it matters:
    - Reduces integration complexity
    - Improves reliability for programmatic integrations
    - Makes Langflow a more predictable part of application and backend architectures

    👉 Explore Langflow 1.8: https://lnkd.in/dcPtKEqY
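A sketch of what a v2-style call could look like from Python using only the standard library; the request-body field names (`workflow_id`, `input`) and the local base URL are assumptions for illustration, not the documented schema:

```python
import json
import urllib.request

BASE_URL = "http://localhost:7860"  # assumed local Langflow address

def build_request(workflow_id: str, message: str) -> urllib.request.Request:
    # v2 style: the flow is identified in the request body,
    # not embedded in the endpoint path.
    payload = {
        "workflow_id": workflow_id,     # assumed field name
        "input": {"message": message},  # assumed field name
    }
    return urllib.request.Request(
        f"{BASE_URL}/api/v2/workflows",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("my-flow", "Summarize today's tickets")
print(req.full_url)  # http://localhost:7860/api/v2/workflows
# A real integration would send it with urllib.request.urlopen(req).
```

Because the endpoint path is fixed, the same client code serves every workflow; only the body changes, which is what makes the integration easier to maintain.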

  • Global Model Provider - Langflow 1.8

    Configure model providers once, reuse them across your canvas. Configuring AI models shouldn’t require repeating API keys and provider settings across every component in a workflow.

    With the launch of Langflow 1.8, model provider configuration becomes centralized and reusable:
    - Providers are configured once at the platform level
    - Smart components reference a shared provider instead of raw credentials
    - Updating credentials or switching providers becomes a single change

    Example: Rotating an API key or switching providers can now be done in one place, without reconfiguring credentials across multiple components.

    Why it matters:
    - Reduces setup time
    - Eliminates configuration drift
    - Improves security by centralizing secret management

    Learn more about this release: https://lnkd.in/dP_hp_JX

  • 🚀 Langflow 1.8 is live

    Langflow 1.8 represents a structural leap in how AI solutions are built, integrated, and scaled. This release makes Langflow more mature, more powerful, and ready for intelligent agents in production, going beyond prototypes and experiments. With 1.8, Langflow is simpler to configure, easier to integrate, faster to use, and prepared for the next generation of AI agents, from visual workflows to production code.

    What’s new in this release:
    🔹 Model Provider Setup: model configuration now follows a single, reusable standard, reducing manual setup, configuration drift, and errors when scaling projects.
    🔹 API Redesign (Phase 1): flows can now be consumed via standardized APIs, making Langflow a more predictable and robust part of applications, systems, and backends.
    🔹 Chat Refactor (Playground Improvements): improved session and message management delivers more stable interactions and a smoother experience when working with long or complex conversations.
    🔹 Inspection Panel: direct access to component configuration, parameters, and optional inputs from the workspace panel, reducing context switching and accelerating debugging and iteration.
    🔹 Knowledge Bases: built-in knowledge bases act as local vector databases inside Langflow, making it easier to store and retrieve documents and datasets while enabling retrieval-augmented workflows.
    🔹 Traces: trace support provides deeper visibility into workflow execution, helping developers follow execution paths, measure latency, track token usage, and debug complex flows more easily.
    🔹 Agentics: adds structured data workflow components, including N→N transformations (aMap), N→1 aggregations (aReduce), 0→N generation (aGenerate), and DataFrame merging without LLM calls, unlocking practical use cases like data enrichment and aggregation.

    👉 Explore Langflow 1.8 and start building production-ready AI agents: https://lnkd.in/dBtZryav

