
BLU Discuss list archive



[Discuss] Remote builds (Re: SVN server - What hardware do I need?)



This talk about SVN and build systems first made me think "why svn, in 2013? 
Didn't git enable us to collectively live past 2011?"

But then the latest discussion about VNC had me intrigued, because for the
fourth time in 5 years I'm redesigning the dev environment for a project team
and, after years of decentralizing, this time I'm finding myself pushing back
in the direction some are advocating here.

In 2012 the big thing seemed to be Vagrant, MacBooks, VirtualBox, and all the
complex moving parts that go with that.  I've grown my hair long, perhaps to
make up for all the hair I've torn out trying to corral the efforts of a
couple dozen engineers on increasingly decentralized build systems.

In 2013, I changed jobs and am faced with the same toolchain (but this time on
a django/python stack instead of java/tomcat; the sooner I can evict PyPI's
'pip install' from the building, the happier I and everyone else will be!).
For whatever reason, Jenkins or an equivalent build server hadn't yet been
implemented, so I've got a greenfield project and authorization to yank out
whatever I want to yank out (assuming I can persuade the developers to go
along...;-)
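(For the pip part, the usual escape hatch is to stand up an in-house mirror or
cache and point pip at it instead of the public index -- a sketch; the hostname
here is made up:

```ini
# ~/.pip/pip.conf -- hypothetical internal mirror; the hostname is an assumption
[global]
index-url = http://pypi.internal.example/simple/
download-cache = ~/.pip/cache
```

After that, every 'pip install -r requirements.txt' resolves against the
mirror, and the public PyPI being slow or down stops being your problem.)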

Right now, developers are expected to spend their first days (perhaps a couple
of weeks) wrestling with getting a couple of VMs set up with vagrant on their
MacBooks, dealing with permissions issues, and using that mess of code to
install other masses of code on a dozen different Amazon EC2 instances.

Rather than answering a headhunter's "get me out of this place" appeals,
I'm responding to this challenge by proposing some new rules that the VNC
users elsewhere in this thread might appreciate.  Rather than larding up more stuff on a
beefier Mac, or pushing more stuff into the horribly-slow EC2 cloud, I'm
leaning rather hard in the direction of /centralizing/ the developer stack on
a solidly-built in-house system.  Pull the rat's nest of stuff into a central
repo, stabilize the VMs, have everything built on an easier-to-manage cluster
service (initially with LXC, probably with OpenStack if it ever becomes
viable) on a few cheap SSD-based servers in our wiring closet, optimize the
heck out of things, and give developers / QA folks / operations folks the
exact same "Build Now" and "Deploy" buttons.  They'll love repeatability,
being able to disconnect at will, and most of all server builds that take 20
or 30 seconds instead of 15 or 20 minutes.  (Did I mention EC2 is *slow* yet?
Yeah, I know, they'll sell you a faster SSD-based service, for a monthly rate
about equal to the one-time price of a server that can run 50 VMs...;-)  I'll
love being able to on-board 10 or 20 new developers in a day instead of a
month.
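For the LXC side, a back-of-the-envelope sketch of what a "Build Now" button
might run per build: snapshot-clone a pre-baked base container, build inside
it, throw it away.  The names here (base-django, the build script path) are
assumptions, and the commands are echoed rather than executed:

```shell
# Dry-run sketch of a per-build throwaway container (lxc 0.9-era CLI).
# 'run' just echoes each command; drop the echo to actually execute.
run() { echo "+ $*"; }

BUILD_ID=${1:-1}
NAME="build-$BUILD_ID"

run lxc-clone -o base-django -n "$NAME"    # clone a pre-baked base, fast on SSD
run lxc-start -n "$NAME" -d                # boot it in the background
run lxc-attach -n "$NAME" -- /opt/build/run-tests.sh  # hypothetical build script
run lxc-stop -n "$NAME"
run lxc-destroy -n "$NAME"                 # nothing left to clean up by hand
```

Since the clone is copy-on-write off a warm base image, that's where the
20-to-30-second builds come from, versus booting a fresh EC2 instance.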

The MacBook then becomes more or less a dumb terminal, communicating with most
everything via a web page and with an IDE (JetBrains or whichever) via... hmm,
perhaps VNC!

Never would've thought I'd be saying this.  Five years ago I assumed that as a
developer you'd want to put the whole cloud onto your personal MacBook or
Fedora/Ubuntu laptop.  But even DevOps guys don't want to be sysadmins of
their own hyper-complicated toolchain.

-rich





BLU is a member of BostonUserGroups
We also thank MIT for the use of their facilities.




Boston Linux & Unix / webmaster@blu.org