Recap—Building No-Code Chatbots for MERL: Version 2.0


Learn how to build custom chatbots for MERL without coding! This webinar recap covers step-by-step instructions, real-world applications, and best practices for creating chatbots on the ChatGPT and Hugging Face platforms.

On October 16th, the Sandbox Working Group hosted a follow-up session to one of our most popular webinars of 2024, demonstrating how to build custom chatbots to enhance monitoring, evaluation, research, and learning (MERL) activities. Co-facilitated by Sarah Osman and Zach Tilton, the session showcased how non-technical practitioners can leverage AI tools to create powerful, purpose-built chatbots without writing code.

Key Highlights

Platforms and Accessibility

  • OpenAI’s GPT Platform: Custom GPTs, previously limited to Plus subscribers, can now be used with free accounts, though building them still requires a Plus subscription. The builder also supports uploading background documentation. 
  • Hugging Face Assistants: Remain a free, open-source alternative that lets you choose the underlying LLM.

Real-World Applications

The webinar showcased several practical applications of custom chatbots in MERL:

  1. MERL Tech-nician: A specialized assistant for digital technology in evaluation
  2. Theory of Change Workshop Support: Custom chatbots supporting youth engagement in participatory evaluation
  3. Resource Summarization: Automated extraction and summarization of evaluation resources for Better Evaluation
  4. Advanced Features: A demonstration of newer capabilities, such as API integrations for extended functionality (sketched below)
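
In the ChatGPT builder, API integrations are configured through the interface rather than code, but the underlying pattern can be sketched programmatically. The snippet below is a rough, hypothetical illustration using the OpenAI Python SDK's function-calling interface; the tool name search_evaluation_resources, its parameters, and the model name are placeholders, not the integration demonstrated in the webinar.

```python
# Minimal sketch of an API integration: the model can request a call to an external
# service through a declared tool (function calling).
# Assumes the openai Python SDK and an OPENAI_API_KEY environment variable; the tool
# name, its parameters, and the model name are hypothetical placeholders.
from openai import OpenAI

client = OpenAI()

tools = [{
    "type": "function",
    "function": {
        "name": "search_evaluation_resources",  # hypothetical external lookup
        "description": "Search an external catalogue of MERL resources by keyword.",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Search keywords"},
            },
            "required": ["query"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": "You are a MERL assistant. Use the search tool when asked for resources."},
        {"role": "user", "content": "Find resources on participatory evaluation."},
    ],
    tools=tools,
)

# If the model chooses to use the tool, the arguments it wants to send are returned
# here; your code would then call the real API and pass the result back.
print(response.choices[0].message.tool_calls)
```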

Building a Custom Chatbot: Step-by-Step

  1. Choose Your Platform
    • ChatGPT (requires Plus subscription for building)
    • Hugging Face (free, open-source alternative)
  2. Define Your Chatbot’s Purpose
    • Specify clear objectives and use cases
    • Consider your intended users
  3. Configure Your Assistant
    • Write a system prompt describing the chatbot’s role
    • Upload any background documents it should draw on
    • On Hugging Face, select the underlying LLM
  4. Test and Refine
    • Start with sample queries
    • Iterate based on performance
    • Update system prompts as needed
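
If you want to script step 4 rather than test by hand in the chat window, a few lines of Python are enough. The sketch below is a minimal example, assuming the huggingface_hub package, a Hugging Face access token, and a chat-capable hosted model; the model ID, system prompt, and sample queries are illustrative, not the assistant built during the webinar.

```python
# Minimal sketch of step 4 (test and refine): run a few sample queries against a
# chosen open model and review the answers before updating the system prompt.
# Assumes the huggingface_hub package and a Hugging Face access token (HF_TOKEN);
# the model ID, system prompt, and queries are illustrative placeholders.
from huggingface_hub import InferenceClient

client = InferenceClient("mistralai/Mistral-7B-Instruct-v0.2")  # any chat-capable hosted model

system_prompt = (
    "You are an assistant for MERL practitioners. "
    "Answer briefly and say when you are unsure."
)

sample_queries = [
    "What is a theory of change?",
    "How can chatbots support youth engagement in participatory evaluation?",
]

for query in sample_queries:
    reply = client.chat_completion(
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": query},
        ],
        max_tokens=300,
    )
    print(f"Q: {query}\nA: {reply.choices[0].message.content}\n")
```

Re-running the same queries after each change to the system prompt or model gives a quick before-and-after view of the refinement.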

Tips and Best Practices

  1. Document Management
    • Save your system prompts separately as backup
    • Consider data privacy when uploading documents
    • Use public or anonymized data for testing
  2. System Prompt Engineering
    • Be specific about the chatbot’s role and limitations
    • Include example interactions
    • Define preferred response formats
  3. Quality Assurance
    • Compare responses with general ChatGPT
    • Test edge cases
    • Validate outputs against source materials
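
To make the quality-assurance comparison concrete, the hypothetical sketch below sends the same query twice, once with a system prompt that spells out role, limitations, and response format (per tip 2) and once without, so the two answers can be reviewed side by side. It assumes the OpenAI Python SDK and an API key; the model name and prompt wording are placeholders.

```python
# Minimal sketch of the quality-assurance tip: send the same query with and without
# the custom system prompt and compare the two answers.
# Assumes the openai Python SDK and an OPENAI_API_KEY environment variable;
# the model name and prompt wording are illustrative placeholders.
from openai import OpenAI

client = OpenAI()

system_prompt = (
    "You are a MERL chatbot for evaluation practitioners. "                      # role
    "Only answer questions about monitoring, evaluation, research, and learning; "
    "say when a question is out of scope. "                                      # limitations
    "Respond as a short bulleted list and flag any uncertainty."                 # preferred format
)

query = "How should I validate chatbot outputs against source materials?"

def ask(messages):
    """Send a chat request and return the assistant's text reply."""
    response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    return response.choices[0].message.content

custom = ask([{"role": "system", "content": system_prompt},
              {"role": "user", "content": query}])
baseline = ask([{"role": "user", "content": query}])  # no custom instructions

print("CUSTOM ASSISTANT:\n", custom)
print("\nGENERAL MODEL:\n", baseline)
```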

Resources

Next Steps

Register here to join us on November 19th for a show-and-tell session featuring hackathon participants and their creative chatbot solutions. Take some time to rate the hackathon submissions. December will bring our year-in-review discussion, where we’ll reflect on the journey and gather community feedback for future directions.

This blog post is part of our ongoing series that recaps monthly webinars on AI applications in MERL. Stay tuned for more updates and insights from the Sandbox Working Group.
