Article by: Jon Gaull, RevOps Consultant - Engineering at Go Nimbly
Redis is an in-memory database with blazing speed. It is also incredibly simple and flexible, making it suitable for a huge variety of applications.
RedisConf was an awesome learning experience where companies on the bleeding edge of the technology shared their experiences. Companies from all over the world are taking advantage of Redis’ flexibility to meet the needs of their high-performance applications.
One thing is clear: DevOps is taking over enterprise IT. The focus on top-line revenue growth is driving successful organizations to invest in the systems necessary to deploy features to production faster than ever.
DevOps Drives Long-Term Business Value Through Agility
A feature that is not in production adds no value. An agile development team should strive to deploy valuable features to production at the lowest possible cost. By keeping the overhead of testing and deployment low, businesses can test more ideas and drive more rapid growth in revenue.
Continuous integration, delivery, and deployment are crucial in supporting an agile business. Continuous deployment is the idea that code should enter production immediately after it passes an automated test process. This reduces the costs of both testing and deployment, allowing the business to respond more quickly to market changes and improve product quality.
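As a minimal sketch of that gate between tests and production, here is a hypothetical CI workflow (GitHub Actions syntax; the `run_tests.sh` and `deploy.sh` scripts are stand-ins for whatever your project actually uses):

```yaml
# Hypothetical workflow: the deploy job runs only after the test job passes,
# which is the essence of continuous deployment.
name: ci
on:
  push:
    branches: [main]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./run_tests.sh          # hypothetical automated test script
  deploy:
    needs: test                      # gate: deploy only on green tests
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./deploy.sh production  # hypothetical deploy script
```

Every push to `main` that passes the automated tests flows straight to production; a red test run stops the pipeline before any deploy step runs.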
DevOps, in a continuous environment, dramatically increases the team’s productivity by putting in place a framework for moving code into production efficiently. Basically, “it moves fast and doesn’t break anything”—with a defect rate of roughly 0.001%.
What Are Microservices?
Microservices are an architectural style in which a large application is built as a collection of small, independently deployable services, each responsible for a single piece of business functionality. For example, if your software operates an e-commerce business, you might have two separate services: one to manage orders and another to manage customers. When a new order is received, both the Orders service and the Customers service communicate with one another (and probably other services) to process the customer’s order.
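As a toy sketch of that Orders/Customers split (all names are hypothetical, and the in-process event bus stands in for whatever transport real services would use):

```python
from collections import defaultdict

class EventBus:
    """Toy in-process stand-in for the transport (a queue, stream, or broker)
    that real microservices would use to communicate."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, event_type, handler):
        self.handlers[event_type].append(handler)

    def publish(self, event_type, payload):
        for handler in self.handlers[event_type]:
            handler(payload)

class OrdersService:
    """Owns order data; announces new orders instead of calling Customers directly."""
    def __init__(self, bus):
        self.bus = bus
        self.orders = {}

    def place_order(self, order_id, customer_id, total):
        self.orders[order_id] = {"customer_id": customer_id, "total": total}
        self.bus.publish("order.placed",
                         {"order_id": order_id, "customer_id": customer_id, "total": total})

class CustomersService:
    """Owns customer data; reacts to order events to keep its own view in sync."""
    def __init__(self, bus):
        self.totals = defaultdict(float)
        bus.subscribe("order.placed", self.on_order_placed)

    def on_order_placed(self, event):
        self.totals[event["customer_id"]] += event["total"]

bus = EventBus()
orders = OrdersService(bus)
customers = CustomersService(bus)
orders.place_order("o-1", "c-42", 19.99)
print(customers.totals["c-42"])  # Customers saw the order without a direct call
```

Note that neither service holds a reference to the other: each owns its own data, and the event is the only coupling between them.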
Microservices are key for today’s enterprises and Fortune 100 businesses for a variety of reasons. But, first and foremost, breaking a monolithic application into smaller services helps isolate risk and downtime during deployments. Reducing the risk and cost of each deployment means that your software can bring value to the customer sooner, more often, and with higher reliability.
Redis Time Series Could Be a Game Changer for Microservices
The core challenge of architecting reliable distributed systems (like microservices) is ensuring that the various components can talk to one another reliably and with the lowest possible latency. A message broker or event stream is typically used to facilitate communication between services.
On Tuesday, Ofer Bengal, CEO and co-founder of RedisLabs, announced the new Redis Time Series data type. Redis Time Series is an extremely low-latency data store—a perfect foundation for performant communication between systems. It is a direct competitor to Kafka, but with the typical Redis flair (RedisLabs claims it’s 500x faster than Kafka).
The microservices architecture calls for each service to manage its own private, persistent data store. When data needs to move between services, a common design pattern is the use of events. Redis Time Series is perfect for event-driven systems like microservices: its incredibly low latency means your microservices scale better and your data is more likely to be in sync when you need it.
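The module itself exposes commands such as TS.ADD and TS.RANGE. To make the data model concrete, here is a minimal in-memory sketch of that model—an append-only series of timestamped samples with window queries—written from scratch for illustration, not the module’s actual implementation:

```python
import bisect

class TimeSeries:
    """Append-only series of (timestamp, value) samples, kept sorted so a
    range query is a binary search plus a slice -- the access pattern that
    makes time-series stores fast for event data."""
    def __init__(self):
        self.timestamps = []
        self.values = []

    def add(self, timestamp, value):
        # Samples usually arrive in timestamp order, so this insert
        # degenerates to a cheap append at the tail.
        idx = bisect.bisect(self.timestamps, timestamp)
        self.timestamps.insert(idx, timestamp)
        self.values.insert(idx, value)

    def range(self, start, end):
        # Return all samples with start <= timestamp <= end.
        lo = bisect.bisect_left(self.timestamps, start)
        hi = bisect.bisect_right(self.timestamps, end)
        return list(zip(self.timestamps[lo:hi], self.values[lo:hi]))

ts = TimeSeries()
ts.add(1000, 5.0)
ts.add(1010, 6.5)
ts.add(1020, 4.2)
print(ts.range(1000, 1010))  # -> [(1000, 5.0), (1010, 6.5)]
```

A consumer service replaying events since its last checkpoint is just a `range(last_seen, now)` call, which is why this shape fits event-driven microservices so well.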
Costs are Coming Down
Cost: it’s the elephant in the room. Redis is an in-memory datastore, and memory is not cheap—especially if you need a lot of it. In order to be a viable replacement for Postgres, MongoDB, or other on-disk storage systems, Redis needs to be less expensive. The solution, it turns out, is multi-faceted, and RedisLabs is tackling it from several angles.
First, RedisLabs offers Redis on Flash, which extends a dataset beyond RAM onto flash storage. Flash storage, while not as fast as RAM, still has very low latency and is much cheaper per GB. Redis instances with flash storage are available today through RedisLabs.
Second, Intel’s new Optane storage technology. Optane is a purpose-built SSD optimized to access large datasets at much higher speeds than typical NVMe-based SSDs. For many use cases, this technology will make it much cheaper to use Redis as a primary data store.
Third, the long-term trend of SSDs getting faster and cheaper as the technology improves will make Redis viable in even more situations. Redis is both incredibly flexible and incredibly fast, which makes it a good solution for any application where cost is not prohibitive. That outcome becomes almost inevitable if the SSDs of the future offer exponentially more storage for the same price.
Docker is Here to Stay
Build it, deploy it. The company that does so most efficiently and with the highest rate of success will always win. Docker facilitates this by guaranteeing that software running in a local development environment behaves (almost) exactly the same as it does on a remote server. It achieves this through “containerization,” a method of using software to create isolated environments in which different pieces of software can run on shared hardware.
Containers and microservices are the perfect complement. The microservices design pattern separates the components of the greater system, which helps isolate risk. Containerization performs the same role at the level of a single microservice. Within one microservice there are often many containers: one for your main application, three more for your three Redis shards, and another for a data replication subsystem, for example.
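That container layout could be sketched as a docker-compose file along these lines (service names and the replicator image are hypothetical, purely to illustrate the shape):

```yaml
# Hypothetical compose file for one microservice: an app container,
# three Redis shard containers, and a data replication worker.
version: "3.8"
services:
  app:
    build: .
    depends_on: [redis-shard-1, redis-shard-2, redis-shard-3]
  redis-shard-1:
    image: redis:5
  redis-shard-2:
    image: redis:5
  redis-shard-3:
    image: redis:5
  replicator:
    build: ./replicator          # hypothetical replication subsystem
    depends_on: [redis-shard-1, redis-shard-2, redis-shard-3]
```

Because each container is isolated, a crash in the replicator or one shard does not take down the app container, mirroring at small scale the risk isolation microservices provide at large scale.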
Jon Gaull is a curious person and engineering consultant specializing in enterprise systems at Go Nimbly. Over the last 10 years as a software developer, Jon’s curiosity has led him from game development to hardware, mobile apps, and even e-commerce.