Job Description
The Data Architect will design, develop, and support enterprise data solutions by integrating complex data from various sources, implementing data governance practices, and ensuring high-performance, secure, and scalable data architectures to meet business needs.
Key Responsibilities
- Design, develop, implement, and support enterprise data solutions including databases, data warehouses, data lakes, and data marts.
- Gather and document data, integration, and development requirements from cross-functional teams to meet business needs.
- Create and maintain data models for complex datasets using modern data access and storage methodologies.
- Lead data management and governance practices, establishing data quality standards and compliance policies.
- Design and oversee data integration from multiple sources and formats, including large-scale ingestion of Excel files.
- Manage and support production databases, data warehouses, and ETL processes ensuring security, scalability, and reliability.
- Provide expertise in data analytics tools such as Power BI, Power Query, and Excel for reporting and business insights.
- Monitor, assess, and optimize the performance of data systems to ensure efficient data processing.
- Ensure proper configuration management, change control, and documentation of data solutions.
- Participate in business analysis activities to gather data requirements and translate them into technical solutions.
Requirements
- Bachelor's degree in Computer Science, Data Science, or a relevant field.
- 5 years of experience designing and developing with Microsoft SQL Server.
- 5 years of experience developing with Python, including data parsing and transformation.
- 2 years of experience working with SAP General Ledger and performing integration and migration tasks with SAP data.
- Hands-on experience with data governance tools such as Microsoft Purview.
- Demonstrated ability to ingest, parse, and transform data from a wide variety of formats, including processing hundreds of Excel files within a data ingestion pipeline (see the sketch following this list).
- Experience with Microsoft Fabric, Excel data modeling, Power BI, and Power Query.
- 4 years of experience designing and developing data warehouses.
- 3 years of experience with modern data technologies such as Snowflake, Spark, and Kafka.
- 3 years of experience developing and deploying ETL solutions.
- 3 years of experience working with data pipeline and orchestration services such as Airflow, SSIS, Azure Data Factory, AWS Data Pipeline, AWS Glue, or Fivetran.
- Expert knowledge of logical and physical data modeling concepts for relational and dimensional data.
- Experience with performance tuning for large, complex data sets.
- Experience working with Git.
- Experience working with structured, semi-structured, and unstructured data sources.
- Familiarity with DevOps practices and database deployment automation tools such as CI/CD pipelines and SSDT.
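
As a rough illustration of the Excel ingestion requirement above, the following Python sketch reads every workbook in a directory with pandas and combines them into a single DataFrame, tagging each row with its source file for lineage. The directory path, file pattern, and column handling are illustrative assumptions only, not a description of the team's actual pipeline or tooling.

```python
# Minimal sketch of bulk Excel ingestion: read every .xlsx workbook under a
# directory, record the source file for data-lineage purposes, and combine
# the results into one DataFrame for downstream transformation.
from pathlib import Path

import pandas as pd


def ingest_excel_files(source_dir: str) -> pd.DataFrame:
    """Parse all .xlsx files under source_dir into one combined DataFrame."""
    frames = []
    for path in sorted(Path(source_dir).glob("*.xlsx")):
        df = pd.read_excel(path)       # requires the openpyxl engine for .xlsx
        df["source_file"] = path.name  # keep lineage for governance/auditing
        frames.append(df)
    if not frames:
        raise FileNotFoundError(f"No .xlsx files found in {source_dir}")
    return pd.concat(frames, ignore_index=True)


if __name__ == "__main__":
    combined = ingest_excel_files("data/incoming")  # hypothetical path
    print(combined.shape)
```

In practice this kind of loop is usually parallelized or pushed into an orchestration tool (Airflow, Azure Data Factory, etc.) once the file count reaches the hundreds, but the core parse-tag-concatenate pattern stays the same.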
Benefits & Perks
Expected salary range of $135,000 - $175,000, based on experience and location
Annual bonus program
401(k) with company match
Equity incentive program
Comprehensive medical, dental, and vision benefits
Paid time off for vacation, holidays, and sick days
Ready to Apply?
Join Hasi and make an impact