- Architect scalable data processing and analytics solutions for big data storage, processing, and consumption.
- Clearly articulate pros and cons of various technologies and platforms while proposing solutions.
- Prior experience on a cloud platform (AWS/Azure/GCP).
- Define architecture, blueprints/patterns, accelerators & interoperability standards.
- Define data security and data privacy standards for solutions.
- Drive the evolution of data architecture.
- Lead architectural discussions and design workshops with internal stakeholders.
- Create technical documentation.
- You have a bachelor’s degree in STEM.
- You have 5+ years of data engineering experience, including 3+ years as a data architect.
- Extensive hands-on experience leading large-scale, full-cycle enterprise data warehousing (EDW), BI reporting, and dashboard development projects.
- You understand the latest database and analytical technologies including data lake/lakehouse architectures and NoSQL storage.
- You have a strong understanding of data modeling techniques and are experienced in building domain data models.
- You have hands-on experience with designing and tuning ETL/ELT pipelines.
- You bring along your excellent SQL and strong hands-on scripting ability.
- You have experience planning data governance and data security practices. Experience in semantic and meta modeling is a plus.
- Confidence with the Microsoft Azure (or AWS, GCP) platform.
- Prior engineering experience using modern programming languages (Python, Scala, Java, etc.) is a big plus.
- Prior experience with designing on-demand analytics systems is a plus.
- Prior exposure to Databricks is nice to have.