AWS Data Architect

Architect / Interior Designer Jobs
ID: 696461
Published 1 month ago by Cognizant
Hartford, Connecticut, United States
Job description
We are Cognizant Artificial Intelligence

Digital technologies, including analytics and AI, give companies a once-in-a-generation opportunity to perform orders of magnitude better than ever before. But clients need new business models built from analyzing customers and business operations at every angle to really understand them.

With the power to apply artificial intelligence and data science to business decisions via enterprise data management solutions, we help leading companies prototype, refine, validate, and bring the most desirable products and delivery models to enterprise scale within weeks.
• You must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future.

This is an onsite position open to any qualified applicant in the United States.

Job Title: AWS Data Architect - Databricks

Job summary:

We are seeking an experienced Architect with 10 to 13 years of experience to join our team. The ideal candidate will have extensive technical skills in Spark in Scala, Delta Sharing, Databricks Unity Catalog administration, the Databricks CLI, Delta Live Pipelines, Structured Streaming, risk management, Apache Airflow, Amazon S3, Amazon Redshift, Python, Databricks SQL, Databricks Delta Lake, Databricks Workflows, and PySpark. Additionally, experience in the Property & Casualty Insurance domain is mandatory.

Roles/Responsibilities
• Own the design and implementation of data architecture solutions using Spark in Scala and Databricks technologies.
• Supervise the development and deployment of Delta Sharing and Databricks Unity Catalog administration.
• Provide guidance on the Databricks CLI and Delta Live Pipelines to streamline data workflows.
• Implement and manage Structured Streaming solutions for real-time data processing (a brief illustrative sketch follows this list).
• Apply risk management principles to ensure data security and compliance.
• Use Apache Airflow for orchestrating sophisticated data workflows.
• Lead data storage and retrieval using Amazon S3 and Amazon Redshift.
• Develop and maintain Python scripts for data processing and automation.
• Build and optimize Databricks SQL queries for efficient data analysis.
• Implement Databricks Delta Lake for scalable and reliable data lakes.
• Craft and manage Databricks Workflows to automate data pipelines.
• Apply PySpark for large-scale data processing and analytics.
• Collaborate with cross-functional teams to ensure data solutions meet business requirements.
• Provide technical guidance and mentorship to junior team members.
• Ensure all solutions adhere to industry best practices and company standards.
• Contribute to the continuous improvement of data architecture processes and methodologies.
• Stay updated with the latest industry trends and technologies to drive innovation.
• Ensure architecture solutions align with company goals and objectives.
• Support the Property & Casualty Insurance domain with tailored data solutions.
• Deliver high-quality, scalable, and maintainable data architecture solutions.
• Ensure the hybrid work model is used effectively to maintain productivity.
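
For illustration only (not part of the original posting), the following minimal PySpark sketch shows the kind of pipeline the Structured Streaming, Delta Lake, and PySpark responsibilities above describe: a stream of JSON policy events is read from S3 and appended to a Delta table. The bucket, schema fields, checkpoint path, and table name are all hypothetical placeholders.

# Minimal, hypothetical sketch: stream JSON policy events from S3 into a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("policy-events-stream").getOrCreate()

# Hypothetical event schema for a Property & Casualty feed.
schema = StructType([
    StructField("policy_id", StringType()),
    StructField("event_type", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", TimestampType()),
])

# Read the raw events as a stream from a placeholder S3 prefix.
events = (
    spark.readStream
    .schema(schema)
    .json("s3://example-bucket/raw/policy-events/")
)

# Append the stream to a Delta table, tracking progress via a checkpoint location.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "s3://example-bucket/checkpoints/policy-events/")
    .outputMode("append")
    .toTable("insurance.bronze_policy_events")  # placeholder schema/table name
)

query.awaitTermination()

On Databricks, equivalent logic could also be expressed as a Delta Live Pipelines definition or scheduled through Databricks Workflows.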

Qualifications
• Extensive experience with Spark in Scala and Databricks technologies.
• Proficiency in Delta Sharing and Databricks Unity Catalog administration.
• Expertise in Databricks CLI and Delta Live Pipelines.
• Strong knowledge of Structured Streaming and risk management.
• Experience with Apache Airflow, Amazon S3, and Amazon Redshift (see the orchestration sketch after this list).
• Proficiency in Python and Databricks SQL.
• Experience with Databricks Delta Lake and Databricks Workflows.
• Solid skills in PySpark for data processing and analytics.
• Mandatory experience in the Property & Casualty Insurance domain.
• Ability to work effectively in a hybrid work model.
• Strong problem-solving and analytical skills.
• Superb communication and teamwork abilities.
• Dedication to continuous learning and professional development.
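
As a purely illustrative companion to the Apache Airflow qualification above (not taken from the posting), here is a minimal DAG sketch that orchestrates a Databricks notebook run. It assumes Airflow 2.x with the Databricks provider installed; the connection id, cluster spec, and notebook path are hypothetical placeholders.

# Hypothetical Airflow DAG: trigger a daily Databricks notebook run on a small job cluster.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG(
    dag_id="daily_policy_data_refresh",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    # Submit a one-time Databricks run that executes a notebook on a transient cluster.
    refresh_delta_tables = DatabricksSubmitRunOperator(
        task_id="refresh_delta_tables",
        databricks_conn_id="databricks_default",  # assumed Airflow connection id
        new_cluster={
            "spark_version": "14.3.x-scala2.12",  # placeholder Databricks runtime
            "node_type_id": "i3.xlarge",          # placeholder AWS node type
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/Repos/data-eng/refresh_policy_tables"},
    )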

Certifications Required

Databricks Certified Data Engineer Associate, AWS Certified Solutions Architect, Apache Airflow Certification

Salary and Other Compensation:

Applications will be accepted until January 16, 2025.

The annual salary for this position is between $81,000 and $140,000, depending on experience and other qualifications of the successful candidate.

This position is also eligible for Cognizant’s discretionary annual incentive program, based on performance and subject to the terms of Cognizant’s applicable plans.

Benefits: Cognizant offers the following benefits for this position, subject to applicable eligibility requirements:
• Medical/Dental/Vision/Life Insurance
• Paid holidays plus Paid Time Off
• 401(k) plan and contributions
• Long-term/Short-term Disability
• Paid Parental Leave
• Employee Stock Purchase Plan

Disclaimer: The salary, other compensation, and benefits information is accurate as of the date of this posting. Cognizant reserves the right to modify this information at any time, subject to applicable law.

#LI-EV1 #CB #Ind123

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Published on 2025/09/10. Modified on 2025/09/10.
