Data Architect
Intermountain Health
Job Description:
The Data Architect/Data Engineer is responsible for designing, implementing, and maintaining both traditional and modern data infrastructures at the staff level. This role combines expertise in relational databases, cloud technologies, and big data tools to create scalable, efficient, and secure data solutions. The data architect/engineer develops strategies for integrating enterprise data, ensuring data quality, and optimizing data access across both cloud and on-premise systems. The staff-level position works on moderate to complex projects and is mentored by senior-level data architects/engineers to deliver robust data models, pipelines, and warehouses that meet organizational needs using technologies like SQL, Python, PySpark, and traditional RDBMS platforms.
Job Essentials
1. Records detailed logical and physical data models in the information repository and coordinates the publication and maintenance of these models.
2. Facilitates database design sessions and translates business requirements into logical and physical system designs for moderately complex projects.
3. Coordinates the physical database design and implementation process for moderately complex projects. Designs the optimal data presentation environment for the customer and works with various reporting tools.
4. Coordinates specific database performance monitoring and tuning tasks, including the design of optimization and indexing schemes for moderately complex projects.
5. Develops, tests, and implements ETL (extract, transform, and load) routines.
6. Documents business requirements and essential functionality requirements for in-house developments.
7. Ensures proper integration with Intermountain Healthcare's existing data, standards, and business processes.
8. Provides leadership for moderately complex projects through development of project work plans, recommendations for optimal utilization of resources, milestone tracking, and coordination of project team efforts. Keeps management and customers informed of progress and exceptions.
9. Assists in the development and implementation of system backup, recovery, and support plans.
10. Addresses customer needs, manages expectations, and ensures system-wide integration.
11. Participates in the technical review process of designs completed by other designers.
12. Develops, documents, and delivers complex ad-hoc queries of production data as requested by business users.
13. Consults with the business users who are performing queries. Works effectively with clients and staff to determine needs, specifications, feasibility, and project priorities.
Minimum Qualifications
Bachelor's degree in Computer Science, Bioinformatics, Statistics, or a closely related field. Degree must be obtained through an accredited institution. Education is verified.
Three years of experience in database design, development, and business intelligence technologies.
Demonstrated high degree of flexibility and motivation in an aggressively changing environment.
Demonstrated general knowledge of SQL (structured query language).
Demonstrated interpersonal communication skills to promote and advance business intelligence in collaboration with our customers.
Demonstrated project management skills, customer service skills, as well as demonstrated organizational skills.
Demonstrated ability to deal with highly sensitive and confidential material and adhere to data security and confidentiality protocols and procedures.
Demonstrated knowledge of general data warehousing concepts, including the primary differences between On-line Transaction Processing (OLTP) and Decision Support systems and how to appropriately support the requirements and environments of each.
Preferred Qualifications
Experience with Databricks, Apache Spark, and Delta Lake for real-time and batch data processing.
Proficiency in data streaming technologies such as Kafka, AWS Kinesis, or Azure Event Hubs.
Experience working with APIs to retrieve and integrate data from external systems.
Experience developing APIs to provide data as a product.
Familiarity with programming languages such as Python and PySpark for data engineering tasks.
Strong knowledge of ETL/ELT processes and tools, including both traditional (e.g., SSIS, Informatica) and cloud-native solutions (e.g., Azure Data Factory, Databricks).
Excellent communication skills for collaborating with stakeholders and teams.
Physical Requirements:
Interact with others requiring the employee to communicate information.
Operate computers and other IT equipment requiring the ability to move fingers and hands.
See and read computer monitors and documents.
Remain sitting or standing for long periods of time to perform work on a computer, telephone, or other equipment.
Location:
Lake Park Building
Work City:
West Valley City
Work State:
Utah
Scheduled Weekly Hours:
40
The hourly range for this position is listed below. Actual hourly rate dependent upon experience.
$43.49 – $68.48
We care about your well-being – mind, body, and spirit – which is why we provide our caregivers a generous benefits package that covers a wide range of programs to foster a sustainable culture of wellness that encompasses living healthy, happy, secure, connected, and engaged.
Learn more about our comprehensive benefits packages for our Idaho, Nevada, and Utah based caregivers (https://intermountainhealthcare.org/careers/working-for-intermountain/employee-benefits/), and for our Colorado, Montana, and Kansas based caregivers (http://www.sclhealthbenefits.org); and our commitment to diversity, equity, and inclusion (https://intermountainhealthcare.org/careers/working-for-intermountain/diversity/).
Intermountain Health is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, age, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
All positions subject to close without notice.