Responsibilities and Duties:
• Provide technical leadership in all phases of software design and development to meet requirements for service stability, reliability, scalability, and security.
• Hire, train, and provide technical direction; lead discussions and coordinate deliverables across multiple engineering teams globally.
• Work closely with Cloud product owners to understand and analyze product requirements, provide feedback, coordinate resources, and deliver complete solutions.
• Drive evaluation and selection of the best-fit, efficient, cost-effective solution stack for the Callx Cloud data platform.
• Drive development of scalable data pipeline infrastructure and services to enable operationally efficient analytics solutions for the Callx Cloud suite of products.
• Create and extend a data lake solution to enable data science workbenches, and implement quality systems to ensure data quality, consistency, security, compliance, and lineage.
• Drive continuous optimization of the data pipelines through automation and tooling.
• Maintain a test-first mindset and use modern DevSecOps practices for Agile development.
• Collaborate with senior leadership to translate platform opportunities into an actionable roadmap, track progress, and deliver new platform capabilities on-time and on-budget.
• Triage and resolve customer escalations and technical issues.
Qualifications:
• 15+ years of highly technical, hands-on software engineering experience, with at least 7 years of cloud-based solution development
• 3+ years of experience leading and mentoring engineering teams, providing strong technical direction and delivering high-quality software on schedule, including delivery of large, cross-functional projects with geographically distributed teams
• Strong, creative problem-solving skills and the ability to abstract details and articulate them meaningfully.
• Passionate about delivering high-quality software solutions and enabling automation in all phases.
• Good understanding of big data engineering challenges and proven experience with data platform engineering (batch and streaming, ingestion, storage, processing, management, governance, integration, consumption patterns)
• Experience designing and performance-tuning batch-based and low-latency, real-time streaming and event-based data solutions (Kafka, Spark, Flink, or similar frameworks).
• Practical experience architecting with the GCP platform and its services, especially the data ecosystem: BigQuery, Datastream, Dataproc, Composer, etc.
• Deep understanding of data cataloging, data governance, and data privacy principles and frameworks, and how to integrate them into data engineering flows.
• Advanced knowledge of data lake technologies, data storage formats (Parquet, ORC, Avro), query engines, and associated concepts for consumption layers.
• Experience implementing solutions that adhere to best practices and guidelines for different privacy and compliance practices around data (GDPR, CCPA).
• Hands-on, expert-level proficiency in one or more of the following programming languages: Python, Java, Scala.
• Organized and goal-focused, with the ability to deliver in a fast-paced environment.
• BS degree in Computer Science, Engineering, or Mathematics, or equivalent industry experience.