OpenStack on OpenStack
Robert Collins, HP Cloud Services
[email protected]
Install / Reconfigure / Upgrade
Bugs
Cruft / Entropy
Hardware failure
CI/CD
Golden images
HA setup
TripleO: OpenStack on OpenStack
→ Continuous integration and delivery
→ Drive installation and maintenance costs down
→ Encapsulate the installation and upgrade process
→ Common API and infrastructure for the above and below clouds
[Diagram: the deployment tool landscape, by layer]
→ Provisioning: Nova (alternatives: MaaS, Crowbar, Razor, manual install)
→ Software configuration: diskimage-builder, os-config-applier (alternatives: Juju, Chef, Puppet)
→ State: os-refresh-config
→ Orchestration: Heat (alternatives: Crowbar; stand-alone Chef/Puppet etc.; vendor-specific tools)
Components
→ Nova bare metal (see Devananda's talk)
→ Heat (see Clint's talk)
→ diskimage-builder (https://github.com/stackforge/diskimage-builder)
→ os-config-applier (https://github.com/stackforge/os-config-applier)
→ os-refresh-config (https://github.com/stackforge/os-refresh-config)
[Diagram: Nova bare metal: nova-compute deploys your machine image onto hardware via PXE and IPMI]
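The deploy flow in that diagram can be sketched in Python; the stub functions below are invented stand-ins for the real IPMI and image-writing plumbing, and simply record the actions so the ordering is visible:

```python
# Invented stubs standing in for real IPMI and image-writing plumbing.
calls = []
def set_next_boot_pxe(node): calls.append(("pxe-boot", node))
def ipmi_power_cycle(node): calls.append(("power-cycle", node))
def write_image(node, image): calls.append(("write", node, image))

def deploy(node, image):
    """Roughly the shape of a bare metal deploy for one instance."""
    set_next_boot_pxe(node)   # node will network-boot a deploy ramdisk
    ipmi_power_cycle(node)    # IPMI forces the reboot into that ramdisk
    write_image(node, image)  # your machine image lands on the node's disk
    ipmi_power_cycle(node)    # reboot into the freshly written image

deploy("node-0", "overcloud-compute.qcow2")
print(calls)
```

The point is that no agent is needed on the node beforehand: PXE and IPMI are enough to take a machine from powered-off metal to a booted golden image.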
Heat
→ Focus on orchestration
→ Supports any CM system within a machine
→ Use Puppet or Chef if you like
→ Delivers configuration metadata to machines
→ Accepts e…
Heat triggers
→ New metadata from Heat
→ Quiesce fragile services
→ (If needed) Upgrade software from Glance
→ (If needed) Reboot
→ Ensure required services are running and/or restarted
→ Perform any migrations (such as seeding initial data)
→ Notify Heat that the deploy is complete on the machine
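That trigger sequence is naturally expressed as ordered hook phases, which is the style os-refresh-config works in: it runs executable scripts from per-phase directories in sorted order. The phase names and hooks below are invented for illustration:

```python
# Invented phase names modelled on the trigger sequence above.
PHASES = ["quiesce.d", "upgrade.d", "configure.d", "migration.d", "notify.d"]

def run_phases(hooks):
    """Run every hook, phase by phase; stop the deploy on first failure."""
    ran = []
    for phase in PHASES:
        for name in sorted(hooks.get(phase, {})):
            if hooks[phase][name]() != 0:   # non-zero exit: deploy failed
                return ran, False
            ran.append(f"{phase}/{name}")
    return ran, True

# Usage: lambdas stand in for shell scripts on the machine.
ran, ok = run_phases({
    "quiesce.d": {"10-stop-api": lambda: 0},
    "upgrade.d": {"20-fetch-image-from-glance": lambda: 0},
    "notify.d":  {"99-signal-heat": lambda: 0},
})
print(ran, ok)
```

Stopping at the first failure matters: Heat only gets the completion signal if every earlier phase succeeded, so a broken upgrade never reports itself as deployed.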
Golden Images
→ Encapsulate a known good set of software
→ The equivalent of packages, at a cluster level
→ Each image can be tested and then deployed as-is, because the configuration is not part of the image
→ Small, focused toolchain to build images:
→ https://github.com/stackforge/diskimage-builder
→ https://github.com/stackforge/tripleo-image-elements
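The "small focused toolchain" composes images from elements: each element contributes numbered hook scripts, and diskimage-builder runs the merged, sorted set while building the image. A toy sketch of that merge (the element names and script names are invented):

```python
# Invented elements: each maps to the numbered scripts it contributes.
elements = {
    "base":     ["10-install-kernel", "50-install-cloud-init"],
    "nova-kvm": ["40-install-nova", "60-enable-kvm"],
}

def merged_run_order(selected):
    """Merge every selected element's scripts; numeric prefixes sort them."""
    scripts = [s for name in selected for s in elements[name]]
    return sorted(scripts)

print(merged_run_order(["base", "nova-kvm"]))
# -> ['10-install-kernel', '40-install-nova', '50-install-cloud-init', '60-enable-kvm']
```

Because the run order is deterministic, the same element set always produces the same image, which is what makes a tested image safe to deploy as-is.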
Deployment
→ A Heat stack defines the cluster
→ Heat drives the Nova API to deliver images to machines
→ Virtual machines in developer test
→ Bare metal Nova for CI/CD and production deployment
Under and over cloud
→ Nova cannot reliably run two different hypervisors in one cloud today
[Diagram: OpenStack (KVM) clouds stacked on top of an OpenStack (bare metal) cloud]
So we run two (or more) clouds:
→ the undercloud: a bare metal cloud that runs on, and owns, all the hardware
→ the overcloud: a regular VM-based cloud running as a tenant on the bare metal cloud
→ additional VM clouds can run as parallel tenants on the undercloud (e.g. for testing)
Undercloud
→ Fully HA bare metal OpenStack
→ Self-hosted: nodes in the control plane are tenants within it
→ Aiming for as few as 2 machines for the control plane
→ All additional nodes are available for the overcloud tenant
Overcloud
→ Fully HA, KVM-based OpenStack hosted by the undercloud
→ Orchestrated by Heat running in the undercloud
→ Can (optionally) use the same disk images for most services
Installation
→ A special case of normal deployment
→ Run a collapsed cluster: a single image with Heat + Nova bare metal in a VM
→ Bridge that to the new datacentre network
→ Enroll the machines
→ Tell Heat that we want an HA configuration
→ Wait while it scales the undercloud out
→ Switch off the VM image
→ Tell Heat to recover from the loss of the VM node (by scaling out again)
→ Deploy the overcloud as a tenant