Careers

Data Architect

Can you design the next big (data) thing?

Description

As a Data Architect with Mutatio, your vision, knowledge and practical experience will drive end-to-end enterprise data solutions for our customers. You will decide how the logical design is developed and implemented across multiple data sources, databases, applications and technology platforms. Your solutions will scale across on-premises infrastructure and cloud IaaS with security, reliability and performance. We are particularly interested in candidates who have strong backgrounds in EDW, BI and ETL best practices and a passion for emerging Big Data technologies.

Requirements

  • Translate business requirements into an effective and scalable enterprise data architecture
  • Understand data usage and flows from both technical and business perspectives
  • Evaluate current systems and operational requirements and develop a full-lifecycle data architecture, from ingestion to EDW, application and BI endpoints
  • Use proven data modeling methods as part of technology architectural design
  • Design conceptual, logical and physical data models across the enterprise
  • Architect for comprehensive data quality, governance and security
  • Design approaches to optimally handle various levels of data volume, variety and velocity in accordance with customer goals and limitations
  • Provide hands-on leadership of ETL design, leveraging strong SQL skills and understanding of EDW and BI downstream requirements
  • Work collaboratively with customers while proposing and evaluating enterprise data architecture options and best practices
  • Develop detailed designs for Cloud and Hybrid data processing, leveraging both managed services such as Databricks and EMR and open-source stacks on IaaS such as Spark, Kafka and Hortonworks

Qualifications

  • Must have at least 7 years of data architecture experience
  • Must be passionate about keeping up with the latest technologies and best practices
  • Deep knowledge of Hadoop, Spark and open source processing and ingestion technologies
  • Outstanding ability to design and deploy enterprise solutions on AWS
  • Excellent verbal and written communication skills
  • Proven ETL and SQL skills

Apply Now

Please upload a file smaller than 10 MB
Please upload a compatible file (.doc, .docx, .pdf, .rtf, or .txt)