February 8, 2021 • 4 min read
We have come far from what Aito was in the beginning. Already today the product looks very different from what we initially thought we would build. Let's call it a learning journey!
This post looks at our product plans for 2021 and the changes we are making to cater to users' needs even better than today. I invite you to comment on and contribute to the plans, best done in our public Slack workspace!
Our initial target user was a software developer (mostly backend) working on an app that requires "smart features". We offered a "predictive database", which is essentially machine learning tooling that lives together with persistent data storage. Thus, the logical offering was a flexible API and a query language, used to get predictions, recommendations and other smartness from the platform.
While we are believers in the bond between machine learning and the database, we have constantly felt the packaging of our product needs another iteration to be spot on.
The most significant growth has been coming from Robotic Process Automation (RPA) use cases. Aito's users are RPA developers working with either 1) an enterprise RPA platform or 2) one of the open-source tools.
This is the big change! We have decided to focus solely on these RPA users' needs, whether they are consultancies helping other companies with, for example, robot-as-a-service business models, or enterprise RPA teams working on their internal use cases.
With focus comes clarity, helping us make better and more educated decisions about our product's future. Here are some of the first things we are working on.
So far, Aito has been packaged as a database. Being a database comes with plenty of baggage and loaded expectations (think of ACID). Many of those expectations are not relevant for most use cases within our chosen focus.
We have decided to stop calling Aito a database. Instead, Aito hosts your datasets, from which you still get the same prediction results. Moving away from the database framing allows us to focus on the most significant value we provide: the utmost simplicity of running production predictions from a dataset that keeps growing or evolving continuously. You will start seeing the word dataset appearing in the Aito Console soon!
While we keep on supporting flexibly defined schemas, there is one limitation that we will be introducing soon: we will stop supporting linked tables for inference. They have seen only minimal use in real use cases.
Rest assured, you will still be able to control your dataset over the API, so all the existing RPA workflows that add new data to Aito will remain functional.
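As a sketch of what "controlling your dataset over the API" looks like from an RPA workflow, here is a minimal Python example that appends one row to a table. The instance URL, table name, and row fields are illustrative; the `/api/v1/data/{table}` path and `x-api-key` header follow Aito's public API conventions at the time of writing, but check your Aito Console for the exact endpoints your instance exposes.

```python
import json
import urllib.request

AITO_URL = "https://my-instance.aito.app"  # hypothetical instance URL
API_KEY = "MY_API_KEY"                     # read/write key from the Aito Console

def build_insert_request(table: str, row: dict) -> urllib.request.Request:
    """Build the HTTP request that appends one row to a dataset table."""
    return urllib.request.Request(
        url=f"{AITO_URL}/api/v1/data/{table}",
        data=json.dumps(row).encode("utf-8"),
        headers={"x-api-key": API_KEY, "content-type": "application/json"},
        method="POST",
    )

# A robot that just processed an invoice can push the outcome back to Aito,
# so the dataset keeps growing with every run:
req = build_insert_request("invoices", {"vendor": "Acme", "gl_account": "6000"})
# urllib.request.urlopen(req) would send it; omitted here to stay offline.
```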
The biggest takeaway from engaging with our users has been that the learning curve to get from idea to a functional prediction query with "old Aito" has been too steep. Schema, data upload, writing a query, making test predictions, making accuracy evaluations — too much work to prove the value.
At the same time, we have seen an apparent and repetitive flow when developing intelligent automation use cases. We will make the standard flow entirely UI based. We think this will be groundbreaking in speed and ease, and fully accessible for an RPA developer.
Terminology will change a bit to reflect the changes. From your dataset, you can choose a prediction target (or several). We will automatically run an evaluation that will near-instantly display the potential of automating the said step or decision in your workflow. Stuff like this:
The current Aito API is flexible to the max and contains loads of aspirational functionality that has seen little real use. The crucial part for any RPA developer running production automation is to quickly request a prediction for a chosen target variable by giving input parameters. Only a very small fraction of our API is needed to accomplish this.
We will soon hide the full Aito Query API and start offering a radically simplified version for making calls to Aito from any RPA platform, using simple HTTP requests. In practice, every time you decide to make a prediction target live, you will get an API endpoint to fetch the predictions from. We believe this will fundamentally lower the learning curve.
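A common pattern on the RPA side is to act on a prediction only when its confidence clears a threshold, and escalate to a human otherwise. The sketch below assumes the simplified endpoint returns JSON like `{"prediction": ..., "confidence": ...}`; those field names are illustrative, not confirmed by this post.

```python
import json

def choose_action(response_body: str, threshold: float = 0.9):
    """Decide whether a robot should automate a step, given a prediction
    response from a (hypothetical) per-target Aito endpoint."""
    result = json.loads(response_body)
    if result["confidence"] >= threshold:
        # Confident enough: let the robot fill in the predicted value.
        return ("automate", result["prediction"])
    # Below threshold: route the case to a human for review.
    return ("escalate", None)

decision = choose_action('{"prediction": "6000", "confidence": 0.97}')
# decision == ("automate", "6000")
```

The threshold itself is a business decision: a stricter value trades automation rate for fewer wrong actions.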
"But I want the API with full flexibility"; I hear some of you say. We will retain the possibility to make completely custom queries "behind the scene", but you will need to be in touch with us to make that happen.
Today, our unit of delivering Aito is an instance — a dedicated machine that hosts your database, the index that Aito builds and the necessary compute resources.
To ensure we can retain a price point that allows the use of AI in any workflow, we are exploring more cost-efficient ways to provide the service to you. Along with the change from a database to a dataset, we will start sharing the infrastructure at our basic pricing tiers. Our meticulous approach to ensuring security for everyone continues.
We will keep on offering dedicated environments for enterprise users. The exact details and pricing are available on request.
While there will be significant changes in the product, the fundamental things remain.
We are big believers in the benefits of our approach of running the entire machine learning workflow "on-demand" at the time of the prediction request. The contrast to most of the ML tools on the market is remarkable. Our users often say that with "old school" ML tools every model is a hassle. With Aito, you essentially have an infinite number of models available all the time, trained with all the data you have, for every target variable, without the hassle.
Our commitment to working closely and transparently with our consulting partners and customers remains, too. We are here to help every one of you implement your first and your next intelligent automation use case. Just get in touch, and we will happily walk you through the opportunities and get you started with Aito!