Associate Data Architect-IND

India | Chennai, India
Job ID: 36240

Job Description

Position at Ingram Micro

12 to 15 years of Experience

  • Own and develop the technical architecture, design, and implementation of big data platforms and business analytics solutions that empower stakeholders to meet their data-driven analytics and reporting needs.
  • Follows the established Data Architecture Framework and Guidelines for the design, development, management, storage, access, navigation, movement, transformation, and quality of corporate data assets.
  • Strong knowledge of data warehousing, data modeling concepts, data architecture, and database design.
  • Strong experience in designing conceptual, logical, and physical data models.
  • Experience integrating data from multiple data sources.
  • Ability to architect end-to-end data products and make the right choice of technology stack based on the use case.
  • Optimize new and current database systems.
  • Strong knowledge of modeling tools such as Erwin, ER/Studio, or MS Visio.
  • Strong experience in reverse engineering and forward engineering, and in generating DDLs with performance considerations in mind.
  • Proven experience developing, publishing, and maintaining data model documentation that demonstrates data lineage.
  • Ability to quickly learn and adapt modeling methods from case studies or other proven approaches.
  • Strong experience in source data analysis and source-to-target (ETL) mappings.
  • Performs data analysis activities to capture data requirements clearly, completely, and correctly, and represents them formally and visually through data models.
  • Creates semantically rich logical data models that define the business data requirements and are independent of any technology solution (i.e., DBMS).
  • Manages the life cycle of the data model from requirements to design to implementation to maintenance.
  • Looks for opportunities to provide data reuse, balancing the issues of centralization and replication.
  • Encourages the Business to manage data as an asset.
  • Ensures the preservation of strategic data assets as applications and technologies come and go.
  • Requires an in-depth understanding of the Business data needs and the ability to align those business needs with the overall Enterprise data vision.
  • Evaluates the use of data against the goals and practices of the Business in a way that yields clear results.
  • Acts as a liaison, deducing the data needs of a particular project or group and explaining the importance and use of the data most relevant to them.
  • Helps ensure the accuracy and accessibility of all “important” data and is responsible for knowing what data is “important” and why.
  • Compiles and maintains the schema definition for physical data models.
  • Reviews all physical data models with the data management committee before implementation.
  • Assists ETL developers in ensuring that the Business data rules are implemented correctly, and clarifies issues as they arise.

What You'll Need To Bring To This Role

  • 10 to 15 years of combined experience in data modeling, relational databases (MS SQL Server DBA), data warehousing, Big Data, application programming, and cloud development
  • Hands-on experience with AWS technologies such as S3, Redshift, Glue, Lambda, EMR, and Athena
  • Experience defining security and cost-optimization strategies in AWS
  • Strong understanding of relational database structures, theories, principles, and practices.
  • Hands-on experience with data architecting, data mining, large-scale data modeling, and business requirements gathering/analysis.
  • Expertise in ETL/ELT architecture and process design, with experience implementing and performance-tuning mappings, jobs, and transformations.
  • Experience with structured and unstructured data.
  • Experience developing custom-built BI and big data reporting solutions using tools such as Tableau.
  • Experience with Big Data technologies such as Hadoop or HDInsight, Hive, Pig, Python, Spark, Oozie, or other tools in the Hadoop ecosystem.
  • Experience with end-to-end implementations of the Snowflake cloud data warehouse, as well as end-to-end on-premises data warehouse implementations.
  • Excellent SQL skills and demonstrated ability to write complex SQL for purposes of analyzing data and/or evaluating how information needs might be translated into backend structures.
  • Strong programming skills (Java, Python)
  • Experience with Agile development
  • Experience with CI/CD systems and DevOps practices, including Docker and Kubernetes

