PySpark Data Engineer, Consultant
Company: Deloitte
Location: Phoenix, Arizona
Posted on: May 16, 2022
Job Description:
The Analytics & Cognitive team leverages the power of data,
analytics, robotics, science and cognitive technologies to uncover
hidden relationships from vast troves of data, generate insights,
and inform decision-making. Together with the Strategy practice,
our Strategy & Analytics portfolio helps clients transform their
business by architecting organizational intelligence programs and
differentiated strategies to win in their chosen markets.
Analytics & Cognitive will work with our clients to:
- Implement large-scale data ecosystems including data
management, governance and the integration of structured and
unstructured data to generate insights leveraging cloud-based
platforms
- Leverage automation, cognitive and science-based techniques to
manage data, predict scenarios and prescribe actions
- Drive operational efficiency by maintaining their data
ecosystems, sourcing analytics expertise and providing As-a-Service
offerings for continuous insights and improvements
Qualifications
Required:
- 3+ years of relevant technology consulting or industry
experience, including experience in information delivery, analytics,
and business intelligence based on data
- 3+ years of experience in Python and/or R
- 3+ years of experience in SQL
- 3+ years of experience with PySpark
- 2+ years of hands-on experience with data core modernization and
data ingestion
- 1+ years of experience leading workstreams or small teams
- Bachelor's Degree or equivalent professional experience
- Travel up to 50% (while travel is a requirement of the role, due
to COVID-19, non-essential travel has been suspended until further
notice)
- Limited immigration sponsorship may be available.
Preferred:
- An advanced degree in the area of specialization
- Experience with cloud platforms such as Amazon Web Services
(AWS), Microsoft Azure, and/or Google Cloud Platform (GCP)
- Experience with Spark and Scala
- Understanding of the benefits of data warehousing, data
architecture, data quality processes, data warehousing design and
implementation, table structure, fact and dimension tables, logical
and physical database design, data modeling, reporting processes,
metadata, and ETL processes.
- Experience designing and implementing reporting and
visualization for unstructured and structured data sets
- Experience designing and developing data cleansing routines
utilizing typical data quality functions involving standardization,
transformation, rationalization, linking, and matching (see the
illustrative sketch after this list)
- Knowledge of data, master data, and metadata-related standards,
processes, and technology
- Experience working with multi-terabyte data sets
- Experience with Data Integration on traditional and Hadoop
environments
- Strong oral and written communication skills, including
presentation skills (MS Visio, MS PowerPoint).
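For illustration only, a minimal PySpark sketch of the kind of data
cleansing routine described above; the input path, column names, and
cleansing rules are hypothetical assumptions, not details from this
posting:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cleansing-sketch").getOrCreate()

# Hypothetical raw customer extract (path and schema are assumed).
raw = spark.read.option("header", True).csv("s3://example-bucket/customers/raw/")

cleansed = (
    raw
    # Standardization: trim whitespace and normalize casing.
    .withColumn("email", F.lower(F.trim(F.col("email"))))
    .withColumn("state", F.upper(F.trim(F.col("state"))))
    # Transformation: parse the signup date into a typed column.
    .withColumn("signup_date", F.to_date(F.col("signup_date"), "yyyy-MM-dd"))
    # Rationalization: drop records that lack a usable key.
    .filter(F.col("customer_id").isNotNull())
    # Linking/matching (simplified): deduplicate on the standardized email.
    .dropDuplicates(["email"])
)

cleansed.write.mode("overwrite").parquet("s3://example-bucket/customers/cleansed/")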