How Factory used LangSmith to automate their feedback loop and improve iteration speed by 2x

The Nugget

  • Factory used LangSmith to automate the feedback loop and enhance debugging, resulting in a 2x increase in iteration speed.

Make it stick

  • 🤖 Factory's AI Droids increase engineering productivity across SDLC stages.
  • 🔍 LangSmith's custom tracing allowed precise tracking and debugging.
  • 🛡️ Self-hosted LangSmith keeps sensitive data private while providing observability for complex LLM pipelines.
  • 🚀 Automated feedback loops doubled iteration speed and cut open-to-merge time by 20%.

Key insights

Leveraging LangSmith for secure and reliable AI operations

  • Factory's Code Droid handles complex software development tasks across SDLC stages, with LangSmith providing the observability behind it.
  • Traditional observability tools were inadequate for Factory’s needs; LangSmith provided the nuanced tracking and debugging required.
  • Integration of LangSmith with AWS CloudWatch logs ensured precise data flow tracking.
  • LangSmith made context-awareness issues in LLM responses straightforward to debug by tying feedback directly to each call (a tracing sketch follows this list).
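
The bullets above describe the tracing setup only in general terms. The sketch below shows what instrumenting a single Droid-style LLM call with LangSmith might look like; it is a minimal illustration under stated assumptions, not Factory's actual code. The function name, run name, model, and the `request_id` used to correlate a trace with CloudWatch log lines are all hypothetical.

```python
# Minimal tracing sketch (illustrative, not Factory's code). Assumes the
# langsmith and openai Python SDKs, with LANGCHAIN_API_KEY, LANGCHAIN_ENDPOINT,
# and LANGCHAIN_PROJECT pointing at a (self-hosted) LangSmith instance.
from openai import OpenAI
from langsmith import traceable
from langsmith.wrappers import wrap_openai

# Wrapping the OpenAI client records each completion as a child run of the trace.
llm = wrap_openai(OpenAI())

@traceable(name="droid-review-step")  # hypothetical run name for one SDLC step
def review_diff(diff: str) -> str:
    """Ask the model to review a diff; the whole call is traced end to end."""
    response = llm.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "system", "content": "You review code diffs for defects."},
            {"role": "user", "content": diff},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # The request_id is a hypothetical correlation key that the same service
    # could also write to its CloudWatch logs, so a LangSmith trace can be
    # matched to the corresponding log lines.
    print(
        review_diff(
            "-    return a - b\n+    return a + b",
            langsmith_extra={"metadata": {"request_id": "req-12345"}},
        )
    )
```

In a setup like this, a failing response found in CloudWatch can be followed by its correlation key to the exact LLM call in LangSmith, along with the inputs and prompt that produced it.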

Closing the product feedback loop with LangSmith

  • LangSmith’s Feedback API enabled faster, more accurate prompt optimization by automating feedback collection (see the sketch after this list).
  • By linking feedback directly to workflows and exporting data for analysis, Factory enhanced their LLM accuracy.
  • This halved iteration time and reduced both the cognitive load and the infrastructure needed for feedback analysis.
  • Factory’s Droids reduced customer open-to-merge time by ~20% and decreased code churn 3x over the first 90 days.
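
As a rough illustration of that loop, the sketch below attaches a product signal (whether a Droid-authored change merged) to the originating run via LangSmith's Feedback API, then pulls low-scoring runs back out for analysis. The feedback key, project name, and merge signal are assumptions for illustration, not Factory's actual schema.

```python
# Feedback-loop sketch (illustrative, not Factory's pipeline). Assumes the
# langsmith SDK and that each Droid workflow stores the LangSmith run_id of
# the LLM call that produced its output.
from langsmith import Client

client = Client()  # reads LANGCHAIN_API_KEY / LANGCHAIN_ENDPOINT from the environment

def record_merge_outcome(run_id: str, merged: bool, comment: str = "") -> None:
    """Attach a product signal (did the Droid's change merge?) to the exact run."""
    client.create_feedback(
        run_id,
        key="pr_merged",              # hypothetical feedback key
        score=1.0 if merged else 0.0,
        comment=comment,
    )

def export_unmerged_runs(project_name: str = "code-droid"):
    """Pull runs whose change did not merge, for offline prompt analysis."""
    return list(
        client.list_runs(
            project_name=project_name,  # hypothetical project name
            filter='and(eq(feedback_key, "pr_merged"), eq(feedback_score, 0))',
        )
    )

# Example wiring: a repository webhook could call
#   record_merge_outcome(run_id="...", merged=True)
# and a nightly job could feed export_unmerged_runs() into prompt reviews.
```

Because the feedback is keyed to individual run IDs rather than aggregated logs, exported runs carry their full inputs and outputs, which is what makes automated prompt analysis possible without extra infrastructure.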

Looking forward: Expanding AI autonomy in the SDLC

  • Factory's Droids, backed by LangSmith observability, have saved clients over 550,000 development hours.
  • A 20% reduction in cycle time allows engineers to focus on innovative, value-added tasks.
  • Factory has secured $15 million in Series A funding to drive further advances in AI autonomy within the SDLC.

Key quotes

  • “Our collaboration with LangChain has been critical to successfully deploying enterprise LLM-based systems. We are significantly more confident in our decision making and operational capabilities thanks to the observability and orchestration-layer tooling that we get from the LangChain team.” – Eno Reyes, CTO of Factory