Redis Labs has announced a slew of new modules designed to enhance consistency, boost machine learning, and bolster JSON document support in its core open-source database.
The new add-ons are optional and will be available in the second half of the year with release 7.0.
Redis has hit the big time in the last year, rising to the top of the AWS hit-parade to become the most popular database on the dominant cloud platform.
In April, Redis Labs proved it was riding this wave with an injection of $110m Series G funding – with investors including none other than SoftBank – and a valuation of $2bn.
The technology has risen in popularity owing to its use as a fast caching layer to improve performance of sluggish web applications. But this is an image that Redis Labs co-founder and CEO Ofer Bengal is keen to shake off.
“Developers love Redis for its open-source roots, but many still perceive it as an excellent caching system for accelerating other databases, because the 10 to 100-millisecond response time of these other databases cannot guarantee a reasonable user experience. This never made sense to us! Why have a slow database with a caching system to support it, when you can have a fast database that can respond in less than one millisecond under any transaction load?” he asked in his keynote for RedisConf 2021 this week.
In June last year, Salvatore Sanfilippo, better known by the nickname “antirez”, stepped down as the maintainer of Redis. Bengal said that since his departure, Redis Labs had formed a new team of core developers including chief architect Yossi Gottlieb and senior software architect Oran Agra as leaders.
Bengal said the team remained committed to Redis’s open-source roots with a new governance model and two releases a year.
In the hope of getting developers and project architects to buy into its more-than-just-a-cache vision, Redis this week announced RedisRaft, set to be generally available in 7.0, with the aim of overcoming the consistency problem that dogs the NoSQL family of databases.
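The article doesn't describe RedisRaft's internals, but the core idea of Raft-style strong consistency is that a write is acknowledged to the client only after a majority of nodes have logged it, so any later majority quorum must overlap with one that saw the write. A toy illustration of that majority-commit rule in Python — purely a sketch of the concept, not RedisRaft's actual implementation:

```python
# Toy sketch of Raft's majority-commit rule (illustrative only;
# NOT RedisRaft's actual implementation).

class Node:
    def __init__(self):
        self.log = []          # replicated log of (index, command) entries

    def append(self, entry):
        self.log.append(entry)
        return True            # a real node could fail or lag here


class Cluster:
    def __init__(self, size=5):
        self.nodes = [Node() for _ in range(size)]
        self.commit_index = -1

    def replicate(self, command):
        """Acknowledge a write only once a majority has logged it."""
        entry = (len(self.nodes[0].log), command)
        acks = sum(1 for n in self.nodes if n.append(entry))
        if acks > len(self.nodes) // 2:     # majority quorum reached
            self.commit_index = entry[0]    # safe to apply and ack the client
            return True
        return False


cluster = Cluster(size=5)
assert cluster.replicate("SET k v")          # committed on a majority
print("commit index:", cluster.commit_index)  # → commit index: 0
```

The point of the quorum check is the consistency guarantee: an acknowledged write can never be silently lost to a minority partition, which is exactly the property that plain asynchronous replication lacks.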
To support a variety of data models, Redis Labs has modules for graph, time series, and JSON. Extending this approach, it has added new indexing and querying capabilities to RediSearch, a tool for developers to build interactive search experiences. Further JSON support comes from Active-Active replication, which Redis Labs offers for distributed deployment of applications built on the document store across cloud, multi-cloud, hybrid cloud, and on-premises environments, “with seamless data migration between any of these deployment options.”
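At its core, a secondary index like RediSearch's maps terms in document fields back to document keys so queries avoid a full scan. A minimal, self-contained sketch of that indexing-and-querying idea in Python — a toy inverted index, not the RediSearch API (the real module is driven by commands such as FT.CREATE and FT.SEARCH against a running Redis server):

```python
# Toy inverted index over JSON-like documents, illustrating the
# indexing/querying idea behind a module like RediSearch.
# This is NOT the RediSearch API (which uses FT.CREATE / FT.SEARCH).
from collections import defaultdict

index = defaultdict(set)   # term -> set of document keys
docs = {}                  # key -> JSON-like document

def add_doc(key, doc, text_fields=("title", "body")):
    """Index the document's text fields as it is stored."""
    docs[key] = doc
    for field in text_fields:
        for term in doc.get(field, "").lower().split():
            index[term].add(key)

def search(*terms):
    """Return keys of documents containing every query term."""
    hits = [index.get(t.lower(), set()) for t in terms]
    return set.intersection(*hits) if hits else set()

add_doc("doc:1", {"title": "Redis as a cache", "body": "fast in-memory store"})
add_doc("doc:2", {"title": "Redis as a database", "body": "fast primary store"})

print(search("fast", "primary"))   # → {'doc:2'}
```

Because the index is maintained at write time, query cost depends on the number of matching keys rather than the total number of documents — the property that makes interactive search over a live dataset feasible.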
Lastly, the company is offering new features for RedisAI, first launched last year. An online feature store powered by RedisAI’s inferencing engine is designed to let models be served closer to where the features live, simplifying the architecture. The company claimed this improves AI-based application performance by two orders of magnitude.
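The article doesn't detail the feature store's design, but the general pattern is straightforward: precomputed features are keyed by entity ID in a low-latency store, and the model reads them at inference time instead of fetching from a slower warehouse. A hedged sketch in Python, with a dict standing in for Redis and a trivial scoring function standing in for a real model — all names and fields here are illustrative, not Redis Labs' API:

```python
# Sketch of an online feature store serving a model next to the data.
# A dict stands in for Redis; score() stands in for a real model.
# Names and fields are illustrative, not Redis Labs' API.

feature_store = {}  # entity id -> feature mapping (in Redis: HSET/HGETALL)

def put_features(entity_id, **features):
    """Write precomputed features for an entity (e.g. from a pipeline)."""
    feature_store[entity_id] = features

def score(features):
    # Trivial stand-in "model": flag high transaction velocity.
    return 1.0 if features["txn_count_1h"] > 10 else 0.0

def infer(entity_id):
    """Fetch features and run the model where the features live."""
    features = feature_store[entity_id]   # sub-millisecond lookup in Redis
    return score(features)

put_features("card:42", txn_count_1h=3, avg_amount=25.0)
put_features("card:99", txn_count_1h=17, avg_amount=900.0)

print(infer("card:42"), infer("card:99"))  # → 0.0 1.0
```

The architectural win is that feature lookup and inference happen in one hop next to the live data, rather than a round trip to a separate model service plus a separate feature database.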
Andy Pavlo, associate professor of databaseology at Carnegie Mellon University, said: “RedisLabs is pushing the extensibility of Redis. Postgres is the most extensible DBMS and the community has developed all sorts of neat extensions. The challenge, however, is that although Redis is a solid DBMS, it is obviously simpler than Postgres so it might not have enough there for people to want to build interesting extensions beyond data types and data structures.”
Pavlo said RediSearch looked “very useful”, written as it was from scratch rather than using Lucene like most systems. RedisAI was using Redis as a blob store for the models to serve inference requests, which was “clever and shows the extensibility of the system,” he said. But he also pointed to a lack of activity in the user forum as a sign RedisAI was perhaps failing to gain traction.
As for Redis moving beyond a cache, Pavlo was sceptical. “I haven’t run across anybody using Redis beyond a cache. And if somebody was using it as the primary store, they wanted to migrate to something else because it was too expensive to leave everything in memory. The lack of SQL is going to hold them back. Every major operational NoSQL except MongoDB and FaunaDB supports some kind of variant of SQL. SQL was here before you were born and it will be here when you die,” he said.
Matthew Aslett, research director for data, AI and analytics with 451 Research, agreed Redis was still mainly used as a cache with most using it for simple, rather than mission-critical, applications.
The new modules addressing consistency, AI, and data models might not prompt existing Redis users to move current workloads onto the database, but they could convince users to consider Redis for transactional workloads as an alternative to Oracle, for example.
That said, with AI it made sense to bring the inference models as close to the live data as possible where latency and user experience are important. Aslett said: “For companies that are already running Redis as the serving layer for the application then performing the inferencing there definitely would make logical sense. If you are running models during a credit card transaction process, for fraud detection for example, it’s not just about doing it fast; it’s in process, without actually delaying what the user is trying to do, such as purchasing something.”
While Redis might not yet be ready to serve as a general-purpose database alternative, Aslett said, it was primarily being considered for new workloads and development projects.
Given its rise in popularity, there may be plenty of headroom in that fanbase before Redis has to worry about taking on the DBMS giants. ®