Founded in 2014, Peak has grown rapidly, in line with the world’s fastest growing SaaS companies. We’ve won numerous awards and attracted significant funding to support the company’s mission of delivering meaningful AI solutions to our impressive range of clients.
If you love tech and are proficient in building and supporting infrastructure for SaaS products on Amazon Web Services (AWS), there is no better place for you to be than at Peak.
We are creating a new market category for the enterprise AI System, pushing to become the global market leader. As a DevOps Engineer at this fast-growth scale-up, you’ll be responsible for architecture, automation, data warehousing and data engineering at Peak.
Our infrastructure is scripted (CloudFormation) and serverless (Serverless Framework, Lambda, API Gateway, etc.), and as a certified ML Competency Partner and Advanced Consulting Partner with AWS, we build our platform on their cloud. Some of the core services, tools and languages you’ll get to work with include:
● Data Lake (built on S3)
● Spark / EMR
● Python / NodeJS / Bash / R
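To give a flavour of the scripted, serverless setup described above, here is a minimal Serverless Framework service definition combining Lambda, API Gateway and an S3 bucket. This is an illustrative sketch only — the service name, function, handler and bucket name are hypothetical, not Peak’s actual configuration:

```yaml
# serverless.yml — minimal sketch of a Lambda + API Gateway service with an S3 bucket.
# All names below (demo-service, hello, demo-data-lake-bucket) are illustrative.
service: demo-service

provider:
  name: aws
  runtime: python3.9
  region: eu-west-1

functions:
  hello:
    handler: handler.hello        # handler.py must define hello(event, context)
    events:
      - httpApi:                  # provisions an API Gateway HTTP API route
          path: /hello
          method: get

resources:                        # raw CloudFormation for additional resources
  Resources:
    DataLakeBucket:
      Type: AWS::S3::Bucket
      Properties:
        BucketName: demo-data-lake-bucket
```

Running `serverless deploy` against a definition like this compiles it into a CloudFormation stack and provisions everything in one step — which is what keeps infrastructure of this kind fully scripted and reproducible.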
At Peak, it is all about ensuring that AI is available for all, by making the workflows of Data Scientists, Data Engineers and AI Engineers easy! The role is responsible for:
● Building infrastructure to support data pipelines using EMR and Apache Spark
● Maintaining and securing multiple AWS accounts
● Hands-on work with 40+ AWS services
● Helping product teams to deploy scalable machine learning models using Amazon SageMaker or AWS Fargate
● Access control, tagging and cost optimisation
● Helping the data science team to make sure they have the best infrastructure available to do their work
● Data loads using Peak’s AI System on a multi-tenant, AI-first product
If you love tech, can help build fresh and exciting cloud-based solutions, enjoy autonomy and working without constraints, and love to solve complex problems, then this is a great opportunity for you.
SKILLS AND EXPERIENCE
Working at the forefront of technology and getting stuck into bleeding edge services (we often get early access to AWS beta release previews) means that most people are unlikely to tick all the boxes from day one. As long as you have a good background and some experience of the tools and services mentioned, and most importantly a drive and determination to learn new things quickly, you’ll be ready to dive right in.
● You are an experienced DevOps engineer preferably from an AWS background
● Experience with AWS, shell scripting/UNIX, CloudFormation and Python
● Direct experience with AWS products such as Kinesis, Lambda, CodePipeline, DynamoDB, Elasticsearch, RDS and Elastic Beanstalk
● Experience with Docker and container orchestration tools like ECS, Kubernetes or Mesos
● Experience in managing continuous integration and continuous delivery pipelines
● Excellent communication skills and a real team player
● A good foundation in programming and application design
● Good at solving complex problems, with the ability to learn new things quickly
● Somebody who keeps up to date with the latest technology and software engineering practices
Any experience of the below would be advantageous:
● Previous experience with big data tools such as Apache Spark/Hadoop and Kafka, and with data processing and analytics
● Experience with serverless concepts
● Application and network security
BENEFITS
● Support to study for AWS and industry qualifications
● Opportunity to help scale Peak and make a large impact in a fast-growing company with global ambitions
● Competitive compensation package, options, pension and generous holiday entitlement
● A fantastic working environment with open communication channels, flat hierarchy and a truly collaborative style
● Peak has a strong culture that we live by, based on shared values: we are open, straightforward, smart, responsible, curious and driven
● Opportunity to influence Peak’s early success in entering the US market in 2019
● Learn from some of the best minds in the UK across multiple disciplines