Senior Data Engineer (Ingestion)
Hybrid Working (6 days per month onsite in our Farringdon office)
Who are we?
Toyota Connected Europe wants to create a better world through connected mobility for all. We are a new company created to bring big data and a customer focus into all aspects of the mobility experience, so everyone’s experience is more personal, convenient, fun, and safe.
You will be joining us at the beginning of Toyota Connected Europe’s journey of building our team and products. We want every member of our team to live and breathe the start-up culture of Toyota Connected Europe and feel and act like an owner every day.
About the role:
The Data Engineering team enables and manages the ingestion of low latency, high volume car telemetry data that powers our engineering and data science teams to build smart and insightful products. We are looking for an experienced Data Engineer to join the team who will have a key role in the design, development, implementation, and documentation of large-scale, distributed software data applications, systems, and services.
What you will do:
- Work closely with the Data Engineering Lead and the Product team to deliver features to customers.
- Adopt modern principles, techniques, and technology to raise software quality, value, and delivery.
- Champion engineering best practices.
- Implement and maintain complex data engineering solutions to acquire and prepare data.
- Create and maintain data pipelines to connect data within and between data stores, applications, and organisations.
- Design, code, verify, test, document, amend, and refactor complex programs/scripts and integration software services.
- Apply agreed standards and tools to achieve well-engineered outcomes.
- Work side-by-side with other talented engineers in a team-oriented, agile software engineering environment.
- Love writing code and keep learning to hone your craft as an engineer.
- Work closely with product owners to shape and deliver features to customers.
Our Tech Stack:
Please note that you do not need to be familiar with all of these; we recognise that in technology there is always a learning curve.
Cloud Providers:
- Primarily AWS, although we still have some legacy services running on Azure.
Languages:
- Java 11+ (ESSENTIAL)
- Kotlin (legacy)
Messaging Stacks:
- Kafka (ESSENTIAL)
- Pulsar (legacy) – we are slowly migrating these services back to Kafka.
Frameworks:
- Spring (ESSENTIAL)
- Apache Flink
- Kafka Streams
- Apache Storm (mostly legacy)
Repositories and CI/CD:
- Gitlab
- Gitlab CI/CD
Data Stores:
- MongoDB
About you:
- Expertise in one of the major real-time data processing frameworks, such as Flink or Kafka Streams.
- Experience in building event-driven and/or streaming data services.
- Strong programming experience in Java (11+) with a sense of ownership and pride in your code.
- Strong database skills and experience with NoSQL and relational databases.
- Strong grasp of data modelling concepts and principles.
- Experience with modern software and data engineering patterns.
- Excellent knowledge of APIs and web services, CI/CD pipelines, and automated testing.
- Experience developing microservices-based architectures.
- Experience developing and delivering systems on at least one major public cloud provider, preferably AWS.
- Passion for agile practices, DevSecOps, and continuous improvement.
- Willingness to get involved in problem resolution and operational maintenance of production services.
- We value talent over formal education, but a degree in a relevant field is appreciated.
Equal Opportunities, Inclusion & Diversity: We’re committed to building a diverse and inclusive group of talent with a broad range of backgrounds, skills, and capabilities. If you have a disability or any other additional need that requires consideration, please let us know.