
Hello and welcome to the Flutter AI Toolkit!

The AI Toolkit is a set of AI chat-related widgets that make it easy to add an AI chat window to your Flutter app. It's organized around an abstract LLM provider API, which makes it easy to swap out the LLM provider your chat uses. Out of the box, it comes with support for two LLM provider integrations: Google Gemini AI and Firebase Vertex AI.

Key features

  • Multi-turn chat: Maintains context across multiple interactions.
  • Streaming responses: Displays AI responses in real-time as they're generated.
  • Rich text display: Supports formatted text in chat messages.
  • Voice input: Allows users to input prompts using speech.
  • Multimedia attachments: Enables sending and receiving various media types.
  • Custom styling: Offers extensive customization to match your app's design.
  • Chat serialization/deserialization: Store and retrieve conversations between app sessions.
  • Pluggable LLM support: Implement a simple interface to plug in your own LLM (see the sketch after this list).
  • Cross-platform support: Compatible with the Android, iOS, web, and macOS platforms.
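
To give a feel for the streaming, history, and pluggable-provider bullets above, here is a minimal, self-contained Dart sketch of the provider idea: a fake "LLM" that streams an echoed reply word by word and keeps a simple history. It deliberately does not implement the toolkit's real LlmProvider interface; the class and member names (EchoProvider, sendMessage, history) are illustrative only, so check the LlmProvider API docs for the actual interface to implement.

    import 'dart:async';

    // Illustrative only: this class does NOT implement the toolkit's real
    // LlmProvider interface; it just shows the provider idea in plain Dart.
    class EchoProvider {
      final List<String> _history = [];

      // Streams a reply word by word, the way a real provider streams tokens
      // from an LLM as they're generated.
      Stream<String> sendMessage(String prompt) async* {
        _history.add('user: $prompt');
        final reply = 'You said: $prompt';
        for (final word in reply.split(' ')) {
          await Future<void>.delayed(const Duration(milliseconds: 50));
          yield '$word ';
        }
        _history.add('llm: $reply');
      }

      // A simple conversation history, the kind of state a provider exposes
      // so chats can be serialized and restored between app sessions.
      List<String> get history => List.unmodifiable(_history);
    }

    Future<void> main() async {
      final provider = EchoProvider();
      await for (final chunk in provider.sendMessage('hello')) {
        print(chunk); // Prints the streamed response chunk by chunk.
      }
      print(provider.history);
    }

A real provider would call out to an LLM service inside its streaming method and surface the chunks to the chat view as they arrive; the Gemini and Vertex providers described in the Getting started steps below are the ready-made integrations.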

Online demo

To see the AI toolkit in action, check out the online demo.

Screenshot of a Halloween-themed web app demo built with the Flutter AI Toolkit.

The demo's source code can be a useful reference as you integrate the AI Toolkit into your own applications.

Getting started

  1. Installation

    Add the following dependencies to your pubspec.yaml file:

    $ flutter pub add flutter_ai_toolkit google_generative_ai firebase_core
  2. Gemini AI configuration

    The AI toolkit supports both Google Gemini AI and Firebase Vertex AI as LLM providers. To use Google Gemini AI, obtain an API key from Google AI Studio. Be careful not to check this key into your source code repository to prevent unauthorized access.

    You'll also need to choose a specific Gemini model name to use in creating an instance of the Gemini model. The following example uses gemini-1.5-flash, but you can choose from an ever-expanding set of models.

    import 'package:flutter/material.dart';
    import 'package:flutter_ai_toolkit/flutter_ai_toolkit.dart';
    import 'package:google_generative_ai/google_generative_ai.dart';
    
    void main() {
      runApp(const MyApp());
    }
    
    class MyApp extends StatelessWidget {
      const MyApp({super.key});
    
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          title: 'Flutter Demo',
          theme: ThemeData(
            colorScheme: ColorScheme.fromSeed(seedColor: Colors.deepPurple),
            useMaterial3: true,
          ),
          home: const ChatPage(title: 'Generative AI Chat'),
        );
      }
    }
    
    class ChatPage extends StatelessWidget {
      const ChatPage({super.key, required this.title});
      final String title;
    
      @override
      Widget build(BuildContext context) {
        return Scaffold(
          appBar: AppBar(title: Text(title)),
          body: LlmChatView(
            provider: GeminiProvider(
              model: GenerativeModel(
                model: 'gemini-1.5-flash',
                apiKey: 'GEMINI-API-KEY',
              ),
            ),
          ),
        );
      }
    }

    The GenerativeModel class comes from the google_generative_ai package. The AI Toolkit builds on top of this package with the GeminiProvider, which plugs Gemini AI into the LlmChatView, the top-level widget that enables an LLM-based chat conversation with your users.

    For a complete usage example, check out the gemini.dart sample.

  3. Vertex AI configuration

    While Gemini AI is useful for quick prototyping, the recommended solution for production apps is Vertex AI in Firebase. This eliminates the need for an API key in your client app and replaces it with a more secure Firebase project. To use Vertex AI in your project, follow the steps described in Get started using the Vertex AI in Firebase SDKs.

    Once that's complete, integrate the new Firebase project into your Flutter app using the flutterfire CLI tool, as described in the Add Firebase to your Flutter app docs.

    After following these instructions, you're ready to use Firebase Vertex AI in your Flutter app. Start by initializing Firebase:

    import 'package:firebase_core/firebase_core.dart';
    import 'package:firebase_vertexai/firebase_vertexai.dart';
    import 'package:flutter_ai_toolkit/flutter_ai_toolkit.dart';
    
    // Other app imports...
    
    import 'firebase_options.dart'; // Generated by `flutterfire configure`.
    
    void main() async {
      WidgetsFlutterBinding.ensureInitialized();
      await Firebase.initializeApp(options: DefaultFirebaseOptions.currentPlatform);
      runApp(const App());
    }
    
    // Implementation of App and other widgets...

    With Firebase properly initialized in your Flutter app, you're now ready to create an instance of the Vertex provider:

    class ChatPage extends StatelessWidget {
      const ChatPage({super.key});
    
      @override
      Widget build(BuildContext context) => Scaffold(
            appBar: AppBar(title: const Text(App.title)),
            // Create the chat view, passing in the Vertex provider.
            body: LlmChatView(
              provider: VertexProvider(
                chatModel: FirebaseVertexAI.instance.generativeModel(
                  model: 'gemini-1.5-flash',
                ),
              ),
            ),
          );
    }

    The FirebaseVertexAI class comes from the firebase_vertexai package. The AI Toolkit provides the VertexProvider class to expose Vertex AI to the LlmChatView. Note that you provide a model name that corresponds to one of the available model names, but you don't provide an API key. All of that is handled as part of the Firebase project.

    For a complete usage example, check out the vertex.dart sample.

  4. Set up device permissions

    To enable your users to take advantage of features like voice input and media attachments, ensure that your app has the necessary permissions:

    • Microphone access: To enable voice input for users, update configs according to the permission and setup instructions for package:record.
    • File selection: To enable users to select and attach files, follow the usage instructions for package:file_selector.
    • Image selection: To enable users to take or select a picture from their device, refer to the installation instructions for package:image_picker.

Samples

To run the example apps in the example/lib directory, first replace the example/lib/gemini_api_key.dart and example/lib/firebase_options.dart files. They're placeholders for configuration needed by the examples.

gemini_api_key.dart

Most of the example apps rely on a Gemini API key, so for those to work, you'll need to add your API key to the example/lib/gemini_api_key.dart file. You can get an API key in Google AI Studio.
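
For reference, the placeholder is just a Dart file that defines the key as a constant, along these lines (the constant name below is a guess; match whatever name the sample code actually imports):

    // example/lib/gemini_api_key.dart
    // The constant name here is illustrative; use the name the samples expect.
    const String geminiApiKey = 'YOUR-GEMINI-API-KEY';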

Warning

Be careful not to check the gemini_api_key.dart file into your git repo.

firebase_options.dart

To use the Vertex AI example app, add your Firebase configuration details to the example/lib/firebase_options.dart file. You can do this by running the flutterfire CLI tool, as described in Add Firebase to your Flutter app, from within the example directory.
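
For example, assuming the FlutterFire CLI is already installed and activated:

$ cd example
$ flutterfire configure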

Warning

Be careful not to check the firebase_options.dart file into your git repo.

Feedback

Along the way, as you use this package, please log issues and feature requests as well as submit any code you'd like to contribute. We want your feedback and contributions to ensure that the AI Toolkit is as robust and useful as it can be for your real-world apps.
