As a Data Engineer with a development background, you will work with the business to facilitate the migration onto GCP. You'll develop code-based ETL pipelines, control the ingestion of significant volumes of data, solve complex scaling problems using performance-specific APIs, and carry out data modelling to shape the data into meaningful analytics.
The successful candidate will come from a Development or DevOps background. Whether you're relatively new to Google Cloud Platform but have experience with Amazon Web Services, or you have a couple of years of GCP experience rather than a decade, this opportunity could be right for you.
- Development experience (Java, Python)
- Experience with data pipeline development and database tools (Dataflow, Apache Airflow, Apache Beam, BigQuery)
- Experience with cloud-native development (AWS, GCP)
- Experience working with big data sets
- A passion for building quality software and owning solutions end to end
- The ability to build quality into your code upfront and resolve production issues by understanding the root cause
- The ability to work with and lead cross-functional teams to deliver goals
- You put yourself in the customer's shoes when thinking about a problem and its solution
- The ability to apply transferable skills and pick up new languages and frameworks
- You're open-minded and adaptable in responding to new challenges and opportunities
- You demonstrate a love for continually learning and improving your skills and knowledge