We are seeking a highly skilled and experienced Data Architect to join our team. This position requires a deep understanding of data engineering, architecture, and analytics, with a focus on modern cloud-based data platforms. The ideal candidate will have at least 10 years of experience designing and implementing data solutions, with a proven track record of leveraging industry-leading technologies such as Databricks, Delta Lake, PySpark, and Lakeflow to deliver robust, scalable, and secure data architectures.
Responsibilities:
- Design and implement end-to-end data architecture, ensuring alignment with business objectives and technical requirements.
- Develop and maintain data models, data pipelines, and data integration strategies using Databricks, Delta Lake, PySpark, and Lakeflow.
- Manage and optimize Delta Live Tables and Delta Sharing to enable near-real-time data processing and secure data sharing across the organization.
- Lead the adoption and management of Unity Catalog for unified data governance, ensuring consistency and security across data platforms.
- Implement best practices for data security and privacy, ensuring compliance with organizational and regulatory requirements.
- Collaborate with cross-functional teams, including data engineers, analysts, and business stakeholders, to ensure data availability and quality.
- Provide technical leadership in designing and optimizing large-scale data lakes and data warehouses.
- Evaluate and implement new technologies and tools that will enhance data architecture and processing capabilities.
- Ensure effective performance tuning, optimization, and troubleshooting for data pipelines and processing systems.
Required Skills and Qualifications:
- A minimum of 10 years of experience in data architecture or related roles.
- Hands-on expertise with Databricks, Delta Lake, PySpark, Lakeflow, Delta Live Tables, Delta Sharing, and Unity Catalog.
- Strong understanding of data security practices, including data encryption, access control, and secure data sharing.
- Deep knowledge of cloud data platforms and architecture, preferably in Azure.
- Experience designing and optimizing complex data pipelines, ensuring high performance and reliability.
- Proven ability to work with large datasets and manage data integration across multiple sources.
- Strong problem-solving skills and ability to innovate in complex data environments.
- Excellent communication and collaboration skills with the ability to work effectively across teams and business units.
Good to Have Skills:
- Familiarity with Azure Data Factory or Fivetran for ETL processes and data pipeline orchestration.
- Experience with Power BI or other BI tools for creating and presenting actionable insights from data.
- Knowledge of other cloud platforms and services (AWS, GCP) is a plus.