Scala Bay Meetup 9.17.2015 - Presentation 1
1. Are we there yet?
Continuous Delivery @Comcast
Brendan O’Bra, Principal Engineer (brendan_o’bra@cable.comcast.com)
Twitter: @brendanobra
LinkedIn: https://www.linkedin.com/in/brendanobra
September 17th, 2015
Scala Bay Meetup
2. We are more than just cable TV…
• Internet
• Phone
• On Demand
• Home Security
• Business Services (Internet)
• Next Gen TV (IP Video)
• Internet TV Apps (On X1)
• TV Content Production (NBC)
• Movies (Universal)
3. Almost everything at “National Scale”
• Tens of datacenters across the US
• Traffic can be “surge-ish”
• “Carrier Grade” Customer Expectations
• Software Upgrades
• Millions of Devices
• Cloud heterogeneity is important
• Thousands of VMs running at any time, across all clouds
4. In the past… many humans, some machines, MUCH time
5. Now: one human, many machines, much LESS time
6. Let’s go FAST
The market was clearly changing, with competitors delivering some cool stuff.
Time to market became a stronger driver.
We did some light reading:
11. Feedback from initial attempts at Continuous Delivery
• It needs a GUI
• Why is it so hard?
• What are all those moving parts there? WTF?!
• Write the GUI with whatever you want!
13. V1: Play + Akka in Scala
• We made an app in Play! The GUI was in Angular, and the REST layer -> biz layer was in Play/Akka (one dude wanted to learn Akka/Play/Scala, the other dude wanted to learn AngularJS). Gumby was born! (See the sketch below.)
• It was a great experiment, it worked in production, and it actually solved real problems
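A minimal sketch of that V1 layering, with illustrative names rather than Gumby's actual code: the Angular GUI POSTs to a Play controller (the REST layer), which hands the work to an Akka actor (the biz layer).

```scala
import akka.actor.{Actor, ActorSystem, Props}
import akka.pattern.ask
import akka.util.Timeout
import play.api.mvc.{Action, Controller}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

// Biz layer: an actor owns the deployment work (the cloud calls are omitted here).
class DeployActor extends Actor {
  def receive = {
    case app: String =>
      // ... call the cloud APIs to build VMs for `app` ...
      sender() ! s"deploy of $app started"
  }
}

// REST layer: the Angular GUI calls POST /deployments/:app (routes entry assumed).
object Deployments extends Controller {
  val system   = ActorSystem("gumby")
  val deployer = system.actorOf(Props[DeployActor], "deployer")
  implicit val timeout: Timeout = Timeout(30.seconds)

  def deploy(app: String) = Action.async {
    (deployer ? app).mapTo[String].map(msg => Ok(msg))
  }
}
```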
14. Gumby: Spray + Akka to scale the Comcast Cloud
• Then… the folks that make X1 heard about this “gumby” thing and told us they wanted to use it. That’s great, but it was only a science experiment…
• Time to rewrite in Spray, and Go Big
• Spray allowed us to really clean things up, and because it was so fast, we could DOS any cloud API we encountered ;) (see the sketch below)
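A minimal spray-routing sketch of the same endpoint after the rewrite; the path and names are assumptions, not the real Gumby API. Keeping the HTTP layer this thin is part of what made it easy to saturate downstream cloud APIs.

```scala
import akka.actor.ActorSystem
import spray.routing.SimpleRoutingApp

// Thin HTTP layer: POST /datacenters/<dc>/deployments kicks off a deploy.
object GumbyHttp extends App with SimpleRoutingApp {
  implicit val system = ActorSystem("gumby")

  startServer(interface = "0.0.0.0", port = 8080) {
    path("datacenters" / Segment / "deployments") { dc =>
      post {
        complete {
          // Hand off to the deployment actors here and respond right away.
          s"deployment started in $dc"
        }
      }
    }
  }
}
```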
16. Fun Facts about Gumby
• Can deploy ~400 VMs per datacenter per hour
• Currently about 21,000 lines of Scala
• Typically does 2-3 full national deploys/week
• Deploys 60% of the national X1 backend footprint (4,000 VMs)
• Will deploy 100% of the national X1 backend footprint by Q1 2016
• Is transitioning to microservices
• Can deploy itself
• David Bolene: “One Big Side Effect”
Editor's notes
Tell a couple of jokes ;) Val is val, and not var
Q&A will be panel style at the end
-Geo Redundancy/load distribution is important
-Special events (Sports, etc.) cause spikes, thundering herds, etc.
-Customers REALLY care about Phones and TV ;)
-Phased, with varying scope (by market, entire country, A/B, etc.)
-Subscriber boxes range from 20 years to a few months old.
-Don’t get locked in: OpenStack, VMware, and EC2 are all supported equally
-VMs carry the workload
We set out to create the most leveraged continuous delivery system we could, using as many off-the-shelf / FOSS components as we could
The goals (which are the same as everyone else’s)
Deliver production-ready software, quickly
Repeatability
No Humans
Take deployment from months to minutes
Risk minimization
Someone smart stumbled upon “Immutable Servers”! Immutable servers are set up/configured once, and never touched again.
Immutable servers then led to “everything is a 1st-class citizen and version-worthy”. The Holy Trinity is “Code, Config and Automation”. All are versioned.
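A sketch of that trinity in Scala, with field names made up for illustration: every deployable unit pins explicit versions of code, config, and automation, so a server can be rebuilt identically at any time.

```scala
// Hypothetical example: all three members of the trinity are versioned together.
case class ReleaseManifest(
  codeArtifact: String,       // e.g. "gumby-service-1.4.2.jar"
  configVersion: String,      // e.g. "prod-config-37"
  automationVersion: String   // e.g. "deploy-playbooks-12"
)
```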
Comcast was getting into the OpenStack business around the same time, and we became aware of this “cloud” thing
SNAPSHOTS became less important for everything beyond the 1st phase of the CI pipeline: if your commit passed tests, it generated a release version of your artifact, which would be tested further and possibly released
What if my config changes? How do I change my server? You don’t: you make a new version, deploy a new machine with the new version, and kill the old one (sketched below)
“Food Not Friends” (Cattle Not Pets)
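A minimal sketch of that replace-don't-mutate rule, reusing the ReleaseManifest sketch above; the Vm and CloudClient types are assumptions, not Gumby's real API.

```scala
case class Vm(id: String, manifest: ReleaseManifest)

trait CloudClient {
  def createVm(manifest: ReleaseManifest): Vm
  def isHealthy(vm: Vm): Boolean
  def destroyVm(vm: Vm): Unit
}

// Config changed? Build a new manifest, stand up a new VM, then kill the old one.
def replace(cloud: CloudClient, old: Vm, next: ReleaseManifest): Vm = {
  val fresh = cloud.createVm(next)
  require(cloud.isHealthy(fresh), s"replacement ${fresh.id} failed its health check")
  cloud.destroyVm(old)
  fresh
}
```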
So, I did… Play seemed like a good choice
Notes on Bullet #2: We used the app to deploy Olympics Live Extras MANY times as the application code evolved over the course of the Olympics