May 28, 2020
DevOps is an approach to software development and delivery that emphasizes collaboration between different stakeholders. DevOps also places priority on automation and the integration of different tools in a single, well-oiled pipeline.
Ultimately, DevOps boils down to cultural values and goals. When it comes to tooling and processes, there is no one right way to “do” DevOps; a variety of different approaches are possible.
However, there is a series of general best practices to follow when designing and implementing a DevOps pipeline, meaning the set of tools and processes that help organizations achieve the goals of DevOps.
A DevOps pipeline refers to the combination of tools and processes that a team uses to achieve continuous delivery, or the rapid development and deployment of new application code and features on a continuous, rolling basis.
As noted above, there are many approaches that organizations can take to building a DevOps pipeline. There is no single set of tools or configurations that has to be used.
Nonetheless, all DevOps pipelines enable a core set of capabilities that are necessary for continuous delivery: continuous integration of new code, automated testing, automated builds and automated deployment.
A well-designed DevOps pipeline is as automated and integrated as possible. The amount of manual effort required on the part of humans to manage the processes described above, and to execute handoffs between one process and another, should be minimal.
DevOps pipelines are sometimes also referred to as CI/CD pipelines because they combine Continuous Integration and Continuous Deployment processes in order to offer a complete solution for developing and deploying code. However, the term CI/CD pipeline can be a bit misleading because CI and CD are not the only components of a successful DevOps pipeline. Software testing and builds are also key parts of the process.
Because a DevOps pipeline includes many different components, setting one up is a complex task. There are a few different approaches that DevOps teams commonly take to implementing their pipelines.
The most basic is a strategy that involves selecting, deploying and integrating each DevOps pipeline component manually. You could think of this as the DIY approach to DevOps pipelines.
Under this strategy, the DevOps team must identify a CI server (such as CircleCI or Jenkins), a test automation framework (such as Selenium), a build automation solution (like NAnt) and a deployment automation tool (such as Octopus Deploy or UrbanCode), set each tool up separately and then configure the integrations necessary to connect them.
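To make the DIY approach concrete, here is a minimal sketch of the kind of glue code that ties such tools together: a small Python script that runs build, test and deploy stages in sequence and stops at the first failure. The specific commands (nant, pytest, octo) are placeholders standing in for whichever tools your team selects, not a prescription.

```python
#!/usr/bin/env python3
"""Minimal DIY pipeline runner: build, test and deploy stages run in sequence.

The commands below are placeholders; substitute the CLI invocations for
whichever build, test and deployment tools your team has actually chosen.
"""
import subprocess
import sys

# Each stage is a name plus the command that implements it (all placeholders).
STAGES = [
    ("build",  ["nant", "build"]),                                  # build automation
    ("test",   ["pytest", "tests/"]),                               # automated tests
    ("deploy", ["octo", "deploy-release", "--project", "my-app"]),  # deployment
]


def run_pipeline() -> int:
    """Run each stage in order; stop at the first failure so broken builds never deploy."""
    for name, command in STAGES:
        print(f"--- stage: {name} ---")
        try:
            result = subprocess.run(command)
        except FileNotFoundError:
            print(f"stage '{name}': command {command[0]!r} not found")
            return 1
        if result.returncode != 0:
            print(f"stage '{name}' failed with exit code {result.returncode}")
            return result.returncode
    print("pipeline finished successfully")
    return 0


if __name__ == "__main__":
    sys.exit(run_pipeline())
```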
[Read more: DevOps Automation]
The DIY approach enables the greatest amount of choice and flexibility in building a DevOps pipeline. It also minimizes dependence on specific vendors (because you can swap out one tool for another without having to build an entirely new pipeline). On the other hand, it requires significant effort to find, deploy and integrate all of the tools.
An alternative approach is to adopt a solution that offers a complete DevOps pipeline in a single platform. Tools like Azure Pipelines or GitLab offer this type of service.
DevOps pipelines implemented using this approach provide less flexibility and choice, because the DevOps team is limited to using whichever tools and options the vendor offers. They may also allow you to deploy applications only to certain clouds or infrastructures. But this strategy is a simpler and faster way to get a DevOps pipeline up and running.
It’s possible, too, to take a hybrid approach to implementing your DevOps pipeline. You can use a DevOps pipeline platform for implementing some portions of the pipeline, but swap in your own tooling for other parts.
For example, although Jenkins is essentially a CI server, it offers plugins that can extend its functionality to cover most parts of a complete DevOps pipeline. Using Jenkins as the core of your DevOps pipeline, you can build out the rest of the functionality you need with plugins and custom scripts. This approach provides a middle ground between setting up a pipeline entirely from scratch and using a ready-made pipeline that is tied to a specific vendor or cloud.
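Those custom scripts can be quite simple. As a hypothetical illustration, the sketch below is a post-deployment smoke test that a Jenkins stage might invoke after the deploy step; the health-check URL and retry settings are assumptions to adapt to your own application.

```python
#!/usr/bin/env python3
"""Post-deployment smoke test, intended to be invoked from a Jenkins pipeline stage.

The health-check URL below is a hypothetical example; point it at whatever
endpoint your application actually exposes.
"""
import sys
import time
import urllib.error
import urllib.request

HEALTH_URL = "http://localhost:8080/health"  # assumption: the app serves a /health endpoint
ATTEMPTS = 5
DELAY_SECONDS = 10


def smoke_test() -> bool:
    """Poll the health endpoint a few times, returning True once it answers with HTTP 200."""
    for attempt in range(1, ATTEMPTS + 1):
        try:
            with urllib.request.urlopen(HEALTH_URL, timeout=5) as response:
                if response.status == 200:
                    print(f"attempt {attempt}: service is healthy")
                    return True
        except urllib.error.URLError as exc:
            print(f"attempt {attempt}: service not ready yet ({exc})")
        time.sleep(DELAY_SECONDS)
    return False


if __name__ == "__main__":
    # A nonzero exit code fails the Jenkins stage, and therefore the build.
    sys.exit(0 if smoke_test() else 1)
```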
Sumo Logic helps put the DevOps philosophy into practice in several ways.
Most obviously, Sumo Logic helps collect and analyze log data in order to improve observability. But its DevOps-centric functionality extends far beyond this.
Sumo Logic also enables DevOps by streamlining and automating log management and analytics across disparate systems. No matter which applications or operating systems create your log data, where it is stored or which teams manage it, Sumo Logic makes it easy to aggregate the data into a central location. It also lets you analyze and visualize the data in a way that makes it actionable for all stakeholders. And because configurations for Sumo Logic collectors are written as code, they can be reused consistently as your organization scales.
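As one hedged illustration of that configuration-as-code idea, the sketch below generates a JSON source definition that an installed Sumo Logic collector can read from its local configuration directory. The field names follow Sumo Logic's documented JSON format for local-file sources, but the application names, log paths and category convention are assumptions to adapt to your own environment.

```python
#!/usr/bin/env python3
"""Generate a local-file source definition for a Sumo Logic installed collector.

The field names follow Sumo Logic's JSON source configuration format; the
application names, log paths and category convention below are assumptions.
"""
import json


def local_file_source(app_name: str, log_path: str) -> dict:
    # One source entry per application or log file; reuse this function as you scale.
    return {
        "sourceType": "LocalFile",
        "name": f"{app_name}-logs",
        "pathExpression": log_path,
        "category": f"prod/{app_name}",  # assumption: your own category naming convention
    }


if __name__ == "__main__":
    config = {
        "api.version": "v1",
        "sources": [
            local_file_source("web", "/var/log/web/*.log"),
            local_file_source("worker", "/var/log/worker/*.log"),
        ],
    }
    # Write the file that the collector reads its local source configuration from.
    with open("sources.json", "w") as handle:
        json.dump(config, handle, indent=2)
    print("wrote sources.json")
```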
These are all reasons to include Sumo Logic in your DevOps toolset. See for yourself with a free trial, no credit card required.