Senior Data Engineer
Atlanta, GA / Grand Rapids, MI / Plano, TX
Job Description
We are seeking a dynamic and highly skilled Senior Data Engineer with extensive experience building self-service, enterprise-scale data platforms on a microservices architecture, and the ability to lead these foundational efforts. This role demands someone who not only possesses a profound understanding of the data engineering landscape but also brings a very strong software engineering background, especially in building microservices frameworks and architectures. The ideal candidate will serve as both an individual contributor and a technical lead, contributing significantly to platform development and actively shaping our data ecosystem.

What we offer:
  • Career Development

  • Competitive Compensation and Benefits

  • Pay Transparency

  • Global Opportunities

Learn More Here: https://www.dematic.com/en-us/about/careers/what-we-offer

Dematic provides equal employment opportunities to all employees and applicants for employment and prohibits discrimination and harassment of any type without regard to race, color, religion, age, sex, national origin, disability status, genetics, protected veteran status, sexual orientation, gender identity or expression, or any other characteristic protected by federal, state or local laws.

This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training.

The base pay range for this role is estimated to be $82,000-$166,000 at the time of posting. Final compensation will be determined by various factors such as work location, education, experience, knowledge, and skills.

Tasks and Qualifications:

This Is What You Will Do in This Role:

  • As a senior engineer, you will be responsible for the ideation, architecture, design, and development of our enterprise data platform.

  • Architect and design core components with a microservices architecture, abstracting away platform and infrastructure intricacies.

  • Create and maintain essential data platform SDKs and libraries, adhering to industry best practices.

  • Design and develop connector frameworks and modern connectors to source data from disparate systems, both on-premises and in the cloud.

  • Design and optimize data storage, processing, and querying performance for large-scale datasets using industry best practices while keeping costs in check.

  • Design and develop data quality frameworks and processes to ensure the accuracy and reliability of data.

  • Design and develop microservices-based semantic layer and metadata management components.

  • Collaborate with data scientists, analysts, and cross-functional teams to design data models, database schemas, and data storage solutions.

  • Design and develop advanced analytics and machine learning capabilities on the data platform.

  • Design and develop observability and data governance frameworks and practices.

  • Stay up to date with the latest data engineering trends, technologies, and best practices.

  • Drive the deployment and release cycles, ensuring a robust and scalable platform.

What We Are Looking For:

  • Bachelor's or Master's degree in Computer Science, Engineering, or related field.

  • 8-10 years of proven experience in modern cloud data engineering, broad exposure to the wider data landscape, and solid software engineering experience.

  • Prior experience architecting and building successful self-service, enterprise-scale data platforms in a greenfield environment with a microservices-based architecture.

  • Proficiency in building end-to-end data platforms and data services in GCP is a must.

  • Proficiency in tools and technologies: BigQuery, Cloud Functions, Cloud Run, Dataform, Dataflow, Dataproc, SQL, Python, Airflow, Pub/Sub.

  • Experience with microservices architectures: Kubernetes, Docker. Our microservices are built on a TypeScript, NestJS, and Node.js stack; candidates with this experience are preferred.

  • Experience building semantic layers.

  • Proficiency in architecting, designing, and developing batch and real-time streaming infrastructure and workloads.

  • Solid experience architecting and implementing metadata management, including data catalogs, data lineage, data quality, and data observability for big data workflows.

  • Hands-on experience with the GCP ecosystem and data lakehouse architectures.

  • Strong understanding of data modeling, data architecture, and data governance principles.

  • Excellent experience with DataOps principles and test automation.

  • Excellent experience with observability tooling: Grafana, Datadog.

Nice to have:

  • Experience with Data Mesh architecture.
  • Experience building semantic layers for data platforms.
  • Experience building scalable IoT architectures.

 

Job Summary
Company: Dematic
Start Date: As soon as possible
Employment Term and Type: Regular, Full Time
Salary and Benefits: $82,000-$166,000
Required Education: Bachelor's Degree
Required Experience: 8 to 10 years