I got this idea from the folks at Yelp and how they manage their deployments and several internal infrastructure tools.
The basic concept is this: if you had a source of information about all your projects in a simple, easy-to-work-with format, what kind of tooling could you build on top of it?
Yelp does this with a git repository full of YAML files. Coupled with a handful of commit hooks, cron jobs, and rsync, they're able to provide local access to this information to any script that wants it.
If you had the ability to get information about all your projects, what kinds of things would you want to know?
- where is the git repository
- where are the servers hosted
- who should be notified if there are problems
- how is it deployed
- how can it be monitored
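One of those per-project YAML documents might look something like this. The field names here are just illustrative guesses, not Yelp's actual schema:

```yaml
# Hypothetical project document; all field names are made up for illustration.
name: billing-api
git: git@github.com:example/billing-api.git
servers:
  - host: billing1.example.com
    provider: ec2
owners:
  - alice@example.com
deploy:
  method: heroku
monitoring:
  healthcheck: https://billing.example.com/health
```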
With this information, what kinds of easy-to-write scripts could be developed?
- connect to all servers and check for security patches
- check all git repositories for outdated requirements
- validate status of all services and notify developers of problems
- build analytics of activity across all projects
- be the source of information for a business dashboard
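As a sketch of how simple these scripts can be, here's one that answers "who should be notified if there are problems" for each server. It assumes the documents have already been synced to a local checkout and parsed into dicts; the file names, field names, and directory layout are all assumptions, not anything Yelp actually uses:

```python
from collections import defaultdict

def owners_by_host(projects):
    """Map each server host to the owners who should be notified,
    given a list of parsed project documents (dicts)."""
    notify = defaultdict(set)
    for project in projects:
        for server in project.get("servers", []):
            notify[server["host"]].update(project.get("owners", []))
    return {host: sorted(owners) for host, owners in notify.items()}

# Against a real local checkout you would load the dicts with something like:
#   import glob, yaml  # PyYAML
#   projects = [yaml.safe_load(open(path)) for path in glob.glob("projects/*.yaml")]
# Inline sample data keeps this sketch self-contained.
projects = [
    {"name": "billing-api",
     "servers": [{"host": "billing1.example.com"}],
     "owners": ["alice@example.com"]},
    {"name": "search",
     "servers": [{"host": "billing1.example.com"},
                 {"host": "search1.example.com"}],
     "owners": ["bob@example.com"]},
]

print(owners_by_host(projects))
```

Note there's no service to stand up and no database to query: the whole script is a glob, a parse, and a loop.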
What's also interesting about this approach is that it easily accommodates new strategies. Whether you're deploying to Heroku, Elastic Beanstalk, raw EC2, Digital Ocean, or through some new deployment service, it doesn't matter: create a new document with the information the new method needs and write the scripts that know how to use it.
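In code, supporting a new deployment strategy is just adding a handler keyed by the document's declared method. This is a minimal sketch, assuming a `deploy.method` field like the one above; the handler bodies and field names are hypothetical:

```python
def deploy(project):
    """Dispatch to a deployment handler based on the project's
    declared method. The handlers here are placeholder stubs."""
    handlers = {
        "heroku": lambda p: f"git push heroku from {p['git']}",
        "ec2": lambda p: f"rsync build to {p['servers'][0]['host']}",
    }
    method = project["deploy"]["method"]
    if method not in handlers:
        raise ValueError(f"no deploy handler for method: {method!r}")
    return handlers[method](project)

project = {
    "git": "git@github.com:example/billing-api.git",
    "servers": [{"host": "billing1.example.com"}],
    "deploy": {"method": "heroku"},
}
print(deploy(project))
```

Adding support for a new platform means adding one entry to the handler table; none of the existing project documents have to change.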
By not using a web service or a database, you gain the simplicity of just reading local files. That low bar makes it trivial to implement new ideas.
A meta project, a project that holds information about all your other projects, is an intriguing and powerful idea.