Responsive Animations in Flutter Apps

Balachandra DS
8 min read · Sep 28, 2020

Hey guys, welcome to my second article. As it is said, the best way to understand a concept is to explain it to someone else. So I want to make this a habit: an article every week to help the Flutter enthusiasts out there, to strengthen my own concepts, and to learn from the avid readers in the comment section. Now, let's dive deep into our Flutter journey. In my last article I gave an introduction to 2D animations using Flare. Animation on its own isn't good enough, right? So let's integrate our animations into a Flutter app and make them responsive and interactive. This was one of my first projects using Flare, and the inspiration for this app was taken from the Teddy Flare example.

The source code for my app is available on GitHub, and the animation file can be forked from Rive.

Common terms used in this article:

  • Offset: Represents a point in Cartesian space, a specified distance from a separately maintained origin.
  • Caret: The blinking cursor shown while typing.
  • Render Box: A render object in a 2D Cartesian coordinate system, with its upper-left corner placed at (0, 0).
  • FlareActor: The widget that displays our Flare file.

Let’s Get Started

I’ll cover four parts in this article:

  • Simple Animations
  • Touch Tracking
  • Interactive Login Screen
  • Buttons Tracking

Simple Animations

We will begin by importing our Flare animation file into our Flutter app. For this you will need the Flare file and the flare_flutter dependency.

flare_flutter dependency

Open the pubspec.yaml file in your app directory and add the flare_flutter dependency.

Run flutter pub get to fetch the latest flare_flutter dependency.

Now create an assets folder in the app directory and place your Flare file in it, then edit pubspec.yaml and add your Flare file's path under assets, as shown in the image.
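The relevant parts of pubspec.yaml would look roughly like this (the version number is illustrative; use the latest from pub.dev):

```yaml
dependencies:
  flutter:
    sdk: flutter
  flare_flutter: ^2.0.6   # illustrative version

flutter:
  assets:
    - assets/Am_bear.flr   # path to your Flare file
```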

Now let's create our controller, which is used to drive animations, mixing, and procedural hierarchy manipulation of the Flare contents. Create a Dart file called teddy_controller.dart and import ‘package:flare_flutter/flare_controls.dart’.

Create a class called TeddyController and inherit the FlareControls class's properties using the extends keyword.

Things to focus on in the code gist: the artboard (passed as an argument to the initialize method) acts as a base or container for our animation, i.e. the surface on which our animation is built.

The onCompleted() function is called when the current animation is completed.

To play our animations, we use the play() function, which is defined in the FlareControls class.

void anime(String txt) {
  play(txt);
}

This function is added to TeddyController so the desired animation can be triggered from outside the class. Now that our controller is ready, let's use it to play our animations.
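Putting those pieces together, the controller might look roughly like this sketch (class and method names follow the article; the exact gist may differ):

```dart
import 'package:flare_flutter/flare.dart';
import 'package:flare_flutter/flare_controls.dart';

class TeddyController extends FlareControls {
  @override
  void initialize(FlutterActorArtboard artboard) {
    super.initialize(artboard);
    // The artboard is the container our animations are built on.
  }

  @override
  void onCompleted(String name) {
    // Called when the animation `name` finishes playing.
  }

  // Exposes FlareControls.play() so callers outside the class
  // can trigger an animation by name.
  void anime(String txt) {
    play(txt);
  }
}
```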

Create a Dart file called animations.dart, import ‘package:flare_flutter/flare_actor.dart’ and the TeddyController class, and initialize TeddyController in initState.

The controller is used to manipulate the animations: positioning, gaze, and starting and stopping the animation.

To use the Flare file in your Flutter code, use the FlareActor widget given below:

FlareActor(
  "assets/Am_bear.flr",
  shouldClip: false,
  alignment: Alignment.bottomCenter,
  fit: BoxFit.fitHeight,
  controller: _teddyController,
),

To play an animation,

play("animation_name");

is used to start the animation; it is accessible from the FlareControls class or a subclass. To call play from our animations class, we use the TeddyController instance, call the anime function, and pass the animation name. Something like this:

_teddyController.anime('flying');

Here _teddyController is an object of the TeddyController class; it is used to call the anime function and play the ‘flying’ animation.
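A minimal widget wiring all of this together might look like the following sketch (widget name and button label are illustrative):

```dart
import 'package:flare_flutter/flare_actor.dart';
import 'package:flutter/material.dart';

import 'teddy_controller.dart';

class Animations extends StatefulWidget {
  @override
  _AnimationsState createState() => _AnimationsState();
}

class _AnimationsState extends State<Animations> {
  TeddyController _teddyController;

  @override
  void initState() {
    super.initState();
    // The controller is created once and handed to the FlareActor.
    _teddyController = TeddyController();
  }

  @override
  Widget build(BuildContext context) {
    return Column(
      children: [
        Expanded(
          child: FlareActor(
            "assets/Am_bear.flr",
            shouldClip: false,
            alignment: Alignment.bottomCenter,
            fit: BoxFit.fitHeight,
            controller: _teddyController,
          ),
        ),
        RaisedButton(
          child: Text('Fly'),
          // Pressing the button plays the 'flying' animation.
          onPressed: () => _teddyController.anime('flying'),
        ),
      ],
    );
  }
}
```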

After arranging our UI and calling the various functions, it will look something like the GIF.

Touch Tracking

Bear looking at your touches on screen

Whenever you touch the screen, the bear must look at the point you touched. For representation purposes, a ball appears wherever you touch.

This is a responsive animation, as the animation is driven by your touch. How do we achieve this?

Firstly, when we created our animation in Rive, we created a node called ctrl_face, which controls the face movement; if the node moves, the face moves accordingly.

Now, do you get the picture? If we can manipulate the ctrl_face node to follow our touches, we can make the bear look at them.

Let’s make some changes in the TeddyController class.

We use the getNode() function to fetch the ctrl_face node and store it in _faceControl. We then save its initial translation, to restore it whenever the animation is initialized.
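In the controller's initialize(), that step might be sketched as follows (field names follow the Teddy example; this is a fragment of the controller class, with flare_dart's Vec2D assumed to be imported):

```dart
ActorNode _faceControl;
Vec2D _originalTranslation;

@override
void initialize(FlutterActorArtboard artboard) {
  super.initialize(artboard);
  // Grab the node that drives the face, and remember where it started
  // so we can return the gaze to its resting position later.
  _faceControl = artboard.getNode("ctrl_face");
  if (_faceControl != null) {
    _originalTranslation = Vec2D.clone(_faceControl.translation);
  }
}
```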

setViewTransform() is called when the view transform changes. It updates the matrix that transforms global Flutter coordinates into Flare world coordinates.

@override
void setViewTransform(Mat2D viewTransform) {
  Mat2D.invert(_globalToFlareWorld, viewTransform);
}

void lookAt(Offset caret) {
  if (caret == null) {
    _hasFocus = false;
    return;
  }
  _caretGlobal[0] = caret.dx;
  _caretGlobal[1] = caret.dy;
  _hasFocus = true;
}

The lookAt() function is used to send the coordinates (offset) of the point the bear should look at.

Now comes the advance function. The complete function can be found in this gist.

The advance method manipulates the ctrl_face position. When the user touches the screen, the offset is passed to lookAt(), which stores it in _caretGlobal. With this coordinate, the controller can compute the new position of ctrl_face by shifting its gaze. The Flare coordinates are used, a proper gaze is applied, and the final frameTranslation is assigned to _faceControl.translation.
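The gist's advance() is longer, but the core idea can be sketched as: convert the global caret point into Flare world coordinates, ease the face node toward it, and write the result back. This is a simplified sketch (the real code also transforms into the node's parent space and limits the gaze distance); dart:math's min is assumed to be imported:

```dart
@override
bool advance(FlutterActorArtboard artboard, double elapsed) {
  super.advance(artboard, elapsed);
  if (_faceControl != null) {
    Vec2D targetTranslation;
    if (_hasFocus) {
      // Map the touch point from Flutter's global coordinates
      // into Flare world coordinates using the inverted view transform.
      targetTranslation =
          Vec2D.transformMat2D(Vec2D(), _caretGlobal, _globalToFlareWorld);
    } else {
      // No touch: drift back to the face's resting position.
      targetTranslation = _originalTranslation;
    }
    // Ease toward the target instead of snapping to it.
    _faceControl.translation = Vec2D.lerp(Vec2D(), _faceControl.translation,
        targetTranslation, min(1.0, elapsed * 5.0));
  }
  // Returning true keeps the controller advancing every frame.
  return true;
}
```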

Now all our services are ready, but how do we get the touch offset value?

For this we can use the Listener widget, which tracks pointer actions.

Here _paintKey identifies the widget on which touches are tracked.

The RenderBox is found, and whenever a pointer event fires, the pointer's coordinates are assigned to the offset variable.

Here a debounce timer is used to keep continuous track of touches; we debounce the listener to get an accurate caret/offset position without reacting to every single pointer event.

Once we get the offset value, we call the lookAt() method so the bear looks at the touch, and a custom painter is used to draw a ball wherever the user touches.
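A sketch of that listener logic, assuming the field names above and an illustrative debounce delay (dart:async's Timer is assumed to be imported):

```dart
final GlobalKey _paintKey = GlobalKey();
Offset _offset;
Timer _debounceTimer;

void _updateOffset(PointerEvent event) {
  // Translate the global pointer position into this widget's
  // local coordinate space for the painter.
  final RenderBox renderBox =
      _paintKey.currentContext.findRenderObject() as RenderBox;
  final Offset local = renderBox.globalToLocal(event.position);

  // Debounce so we don't rebuild on every single pointer event.
  _debounceTimer?.cancel();
  _debounceTimer = Timer(const Duration(milliseconds: 40), () {
    setState(() => _offset = local);
    // lookAt() expects global coordinates.
    _teddyController.lookAt(event.position);
  });
}

// In build():
// Listener(
//   onPointerDown: _updateOffset,
//   onPointerMove: _updateOffset,
//   child: CustomPaint(/* key: _paintKey, ... */),
// )
```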

CustomPaint(
  painter: new MyCustomPainter(_offset, _image),
  key: _paintKey,
  child: Column(
    // here put your container and its child as FlareActor
  ),
),

Here, the CustomPaint widget acts as a canvas for our touches, and it uses the MyCustomPainter class to draw a ball wherever we touch.

The paint() function basically says: if there is no offset (meaning no touch), do nothing; but if the offset value is not null, the given image (the ball) is drawn on the canvas.
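MyCustomPainter might be sketched like this, under the assumption that the ball has been pre-loaded as a dart:ui Image (imported here as ui):

```dart
import 'dart:ui' as ui;

import 'package:flutter/material.dart';

class MyCustomPainter extends CustomPainter {
  final Offset offset;
  final ui.Image image;

  MyCustomPainter(this.offset, this.image);

  @override
  void paint(Canvas canvas, Size size) {
    // No touch yet, or image not loaded: draw nothing.
    if (offset == null || image == null) return;
    // Draw the ball centered on the touch point.
    canvas.drawImage(
        image, offset - Offset(image.width / 2, image.height / 2), Paint());
  }

  @override
  bool shouldRepaint(MyCustomPainter oldDelegate) =>
      oldDelegate.offset != offset;
}
```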

Interactive Login Screen

Whenever the user starts typing, the bear will look at each letter the user types, or at the caret symbol. It is similar to touch tracking, but the catch is that we need to look at each keystroke, so we need an additional file (input_helper.dart) to give us the offset of each letter typed, or the position of the caret, whenever the text or the caret position changes.

Here there are three functions: recursiveFinder(), globalize(), and getCaretPosition().

The first function recurses through the render tree until the RenderEditable (the text box) is found, and returns it.

The second function converts a given point from the box's local coordinate system to the global coordinate system in logical pixels, records the direction, and adds it to a list.

The third function returns the final offset value after finding the RenderEditable, converting its coordinates to global, and adding a -2 offset on the y-axis.
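Under those descriptions, input_helper.dart might look roughly like this sketch (function names follow the article; the -2 y-offset nudges the point toward the caret's baseline):

```dart
import 'package:flutter/rendering.dart';

// Walk the render tree under `root` until the RenderEditable
// (the actual text box) is found.
RenderEditable findRenderEditable(RenderObject root) {
  RenderEditable editable;
  void recursiveFinder(RenderObject child) {
    if (child is RenderEditable) {
      editable = child;
      return;
    }
    child.visitChildren(recursiveFinder);
  }

  recursiveFinder(root);
  return editable;
}

// Convert caret-local selection points into global coordinates,
// keeping each point's text direction.
List<TextSelectionPoint> globalize(
    Iterable<TextSelectionPoint> points, RenderBox box) {
  return points
      .map((TextSelectionPoint point) => TextSelectionPoint(
            box.localToGlobal(point.point),
            point.direction,
          ))
      .toList();
}

// Returns the global offset of the caret inside `fieldBox`.
Offset getCaretPosition(RenderBox fieldBox) {
  final RenderEditable editable = findRenderEditable(fieldBox);
  if (editable == null || !editable.selection.isValid) return null;
  final List<TextSelectionPoint> points = globalize(
      editable.getEndpointsForSelection(editable.selection), editable);
  return points.first.point + const Offset(0.0, -2.0);
}
```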

We create a TextFormField and a TextEditingController, to which a listener is attached to get the caret position.

Now getCaretPosition() is called from the tracking_input file: the render object is identified by a key, assigned to fieldBox, and passed as an argument to that function. The returned offset (i.e. the caret position) is used to call lookAt() every time the cursor moves or the text changes.
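The wiring on the form side might be sketched as follows (key and controller names are illustrative):

```dart
final GlobalKey _fieldKey = GlobalKey();
final TextEditingController _textController = TextEditingController();

@override
void initState() {
  super.initState();
  // Fires whenever the text or the caret position changes.
  _textController.addListener(() {
    final RenderBox fieldBox =
        _fieldKey.currentContext.findRenderObject() as RenderBox;
    final Offset caret = getCaretPosition(fieldBox);
    // The bear looks at the caret (or looks away if it is null).
    _teddyController.lookAt(caret);
  });
}

// In build():
// TextFormField(
//   key: _fieldKey,
//   controller: _textController,
// )
```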

After all this you can add a validator for the login. If the password is correct you can play one animation; if it's wrong you can play another.

The final output will look like this.

Buttons Tracking

Now this will be easy, as we already know how to track touches…

A Container holds our FlareActor and buttons. A key is passed to the Container to identify the widget within which all touches are tracked.

Final app

All the steps are the same as in Touch Tracking; instead of a custom paint, we just track the buttons.

We add the Listener for pointer actions, find the RenderBox, and pass the offset to the lookAt function, and et voilà, the bear looks at the buttons pressed.
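That is the same pattern as touch tracking, minus the painter; a sketch (key name and button label are illustrative):

```dart
Listener(
  // Any press inside the container (including the buttons)
  // becomes a gaze target for the bear; lookAt() takes
  // global coordinates.
  onPointerDown: (PointerDownEvent event) =>
      _teddyController.lookAt(event.position),
  child: Container(
    key: _containerKey,
    child: Column(
      children: [
        // FlareActor("assets/Am_bear.flr", controller: _teddyController),
        // RaisedButton(child: Text('Login'), onPressed: _validate),
      ],
    ),
  ),
)
```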

Final app will look like this.

Note: For Login, Touch Track, and Button Track, use this Flare file; for Simple Animations, use this one.

The app's source code can be found on GitHub; star the repo and fork it. The animation files are on Rive; like the animation and fork it to customize it further.

I got something wrong? Mention it in the comments. I would love to improve.
If you learnt even a thing or two, clap your hands 👏 as many times as you can to show your support!

About the author: Balachandra is an undergraduate student who is passionate about developing mobile and web apps. You can connect with him on Twitter, LinkedIn, and GitHub.
