Navigating between screens (Flutter)

If your Flutter app has several screens, you can add voice commands to navigate through the app. For example, if the app shows a list of products, you may let the user open product details and then go back to the products list with voice. In this tutorial, we will add a new screen to our simple Flutter app and create voice commands to navigate between the app screens.

If you are a visual learner, watch this tutorial on the Alan AI YouTube channel.

What you will learn

  • How to send commands to a Flutter app

  • How to handle commands on the Flutter app side

  • How to navigate between screens of a Flutter app with voice

What you will need

To go through this tutorial, make sure the following prerequisites are met:

  • You have completed the following tutorial: Building a voice Agentic Interface for a Flutter app.

  • You have set up the Flutter environment and it is functioning properly. For details, see Flutter documentation.

  • The device on which you are planning to test the Flutter app is connected to the Internet. The Internet connection is required to let the Flutter app communicate with the dialog script running in the Alan AI Cloud.

Step 1: Add a new screen to the Flutter app

Note

This step is required if you are using the Flutter app created in the previous tutorial. You can also use your own app with several screens. In this case, skip this step and go to step 2.

In the Building a voice Agentic Interface for a Flutter app tutorial, we created a single-screen Flutter app with the Alan AI agentic interface. Now let’s add a new screen to this app.

  1. Open the app, go to the main.dart file and add the code for the second screen:

    main.dart
    /// Add the second screen
    class SecondPage extends StatefulWidget {
      const SecondPage({Key? key}) : super(key: key);
    
      @override
      _SecondPageState createState() => _SecondPageState();
    }
    
    class _SecondPageState extends State<SecondPage> {
      @override
      Widget build(BuildContext context) {
        return Scaffold(
          appBar: AppBar(
            title: Text("Flutter Demo Second Page"),
          ),
          body: Center(
            child: ElevatedButton(
                child: Text("Go back"),
                onPressed: () {
                  Navigator.pop(context);
                }
            ),
          ),
        );
      }
    }
    
  2. In the MaterialApp constructor, define the app routes:

    main.dart
    class MyApp extends StatelessWidget {
    
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          title: 'Flutter Demo',
          theme: ThemeData(
            primarySwatch: Colors.blue,
          ),
          home: MyHomePage(title: 'Flutter Demo Home Page'),
          /// Define the app routes
          initialRoute: '/',
          routes: {
            '/second': (context) => const SecondPage(),
          }
        );
      }
    }
    
  3. The second screen contains only one button that brings us back to the home screen. Let’s also add a button to the home screen to navigate to the second screen. To the body of the home screen, add the following button code:

    main.dart
    @override
    Widget build(BuildContext context) {
      return Scaffold(
        body: Center(
          child: Column(
            children: <Widget>[
              Text(
                '$_counter',
                style: Theme.of(context).textTheme.headline4,
              ),
              /// Add a button
              ElevatedButton(
                child: Text("Open the second screen"),
                onPressed: () {
                  Navigator.pushNamed(context, '/second');
                }
              ),
            ],
          ),
        ),
      );
    }
    

You can test it: run the app. Now our app has two screens, and we can navigate between them. Tap the Open the second screen and Go back buttons to go to the second screen and back.

Step 2: Add voice commands for navigation

We need to add new commands to the dialog script to navigate between screens with voice. In Alan AI Studio, open the project and, in the code editor, add the following intents:

Dialog script
intent('Open the second screen', p => {
    p.play({command: 'forward'});
    p.play('Opening the second screen');
});

intent('Go back', p => {
    p.play({command: 'back'});
    p.play('Going back');
});

Now, when we say one of these commands to the app, two things happen:

  • Alan AI sends the command provided in the intent to the Flutter app. To send the command, we need to specify a JSON object in the p.play function. In this tutorial, the object contains the command name.

  • The Agentic Interface plays back the action confirmation to us.
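The JSON object passed to p.play is delivered to the app as-is, so it can carry any extra fields your handler needs, not just the command name. A minimal plain-JavaScript sketch of building such an object (the route field is a hypothetical extension, not part of this tutorial):

```javascript
// Sketch: build the object passed to p.play({...}).
// Whatever fields it contains arrive in the app's onCommand
// handler as command.data. The optional "route" field is a
// hypothetical extension; this tutorial sends only the command name.
function buildNavigationCommand(name, route) {
  return route ? { command: name, route: route } : { command: name };
}

// In the dialog script, this could be used as:
//   intent('Open the second screen', p => {
//     p.play(buildNavigationCommand('forward', '/second'));
//     p.play('Opening the second screen');
//   });

console.log(JSON.stringify(buildNavigationCommand('forward', '/second')));
// → {"command":"forward","route":"/second"}
console.log(JSON.stringify(buildNavigationCommand('back')));
// → {"command":"back"}
```

Keeping the payload a flat object with a command field makes the app-side handler a simple switch on one key, as shown in the next step.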

Step 3: Handle commands on the app side

When we say Open the second screen or Go back, Alan AI sends a command to the Flutter app. We need to handle this command on the app side and make sure an appropriate action is performed. To do this, we will add a handler in the app.

  1. In the main.dart file, update the onCommand handler:

    main.dart
    class _MyHomePageState extends State<MyHomePage> {
      _MyHomePageState() {
        AlanVoice.addButton(
          "976d23299e2cfbc77d43485dbc3cb44a2e956eca572e1d8b807a3e2338fdd0dc/stage",
          buttonAlign: AlanVoice.BUTTON_ALIGN_LEFT);
    
        /// Update the onCommand handler
        AlanVoice.onCommand.add((command) => _handleCommand(command.data));
      }
    }
    
  2. To the _MyHomePageState class, add the _handleCommand() function to handle commands passed from the dialog script:

    main.dart
    void _handleCommand(Map<String, dynamic> command) {
      switch(command["command"]) {
        case "forward":
          Navigator.pushNamed(context, '/second');
          break;
        case "back":
          Navigator.pop(context);
          break;
        default:
          debugPrint("Unknown command");
      }
    }
    

Here is how it works: when the Flutter app receives a command from the dialog script, _handleCommand is invoked. If the command is forward, the second screen opens. If the command is back, the app returns to the home screen.
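The switch in _handleCommand can also be thought of as a lookup table from command names to actions, a pattern that scales well as commands accumulate. A plain-JavaScript sketch of that dispatch pattern (illustrative only; the real handler is the Dart code above, and the returned strings merely stand in for the Navigator calls):

```javascript
// Sketch of command dispatch as a lookup table instead of a switch.
// The action strings stand in for the Navigator calls made in Dart.
const commandHandlers = {
  forward: () => "Navigator.pushNamed(context, '/second')",
  back: () => "Navigator.pop(context)",
};

function handleCommand(data) {
  const handler = commandHandlers[data.command];
  return handler ? handler() : "Unknown command";
}

console.log(handleCommand({ command: "forward" }));
// → Navigator.pushNamed(context, '/second')
console.log(handleCommand({ command: "fly" }));
// → Unknown command
```

In Dart, the equivalent would be a Map from command names to callbacks; either form keeps unknown commands falling through to a single default branch.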

You can test it: run the app on the device, tap the Alan AI agentic interface and say: Open the second screen. Then say Go back.

What’s next?

Have a look at the next tutorial: Passing the app state to the dialog script.
