Data Architect (Snowflake)

  • Illinois
  • NextRow
Job Description

NextRow is seeking a Data Architect (Snowflake). You must be a problem solver who looks to elevate the work of everyone around you.

RESPONSIBILITIES:

- Provides expert guidance to projects to ensure that their processes and deliverables align with the Client's target-state architecture.

- Defines and develops enterprise data architecture concepts and standards, leveraging leading architecture practices and advanced data technologies.

- Gathers requirements from business stakeholders; domain areas include agile team work, people data, and hierarchies of project portfolio work.

- Writes requirements for ETL and BI developers.

- Writes data architecture designs for data warehouse, data lake, and end-to-end pipeline solutions.

- Applies expert knowledge of data architecture principles and distributed computing.

- Handles intake prioritization, cost/benefit analysis, and decisions about what to pursue across a wide base of users and stakeholders and across products, databases, and services.

- Designs or approves data models that provide a full view of what the Client's technology teams are working on and the business impact they are having.

- Leads end-to-end data pipeline design, security review, architecture, and deployment oversight.

- Automates reporting views used by management and executives to decide where to invest the organization's time and resources and to stay up to date on key company initiatives and products.

- Creates self-service reporting, including a data lake for the Client's internal projects and resources.

- Designs comprehensive data quality management tooling.

QUALIFICATIONS:

- 6+ years of experience in data engineering or related technical work, including business intelligence and analytics

- 4+ years of experience architecting commercial-scale data pipelines

- Experience and comfort solving problems in an ambiguous, constantly changing environment; the tenacity to thrive in a dynamic, fast-paced setting, inspire change, and collaborate with a variety of individuals and organizational partners

- Experience designing and building scalable and robust data pipelines to enable data-driven decisions for the business

- Exposure to Amazon Web Services (AWS) or another cloud provider

- Experience with Business Intelligence tools such as Tableau, ThoughtSpot, Power BI, and/or Looker

- Familiarity with data warehousing platforms and data pipeline tools such as Redshift, Snowflake, SQL Server, etc.

- Passionate about programming and learning new technologies; focused on helping yourself and the team improve skills

- Effective problem-solving and analytical skills; ability to manage multiple projects and reports simultaneously across different stakeholders

- Rigorous attention to detail and accuracy

- Aware of and motivated by driving business value

- Experience with large-scale enterprise applications using big data open-source solutions such as Spark, Kafka, Elasticsearch/Solr, Hadoop, and HBase

- Experience with or knowledge of basic programming and database technologies (SQL, Python, Cassandra, PostgreSQL, AWS Aurora, AWS RDS, MongoDB, Redis, Couchbase, Oracle, MySQL, Teradata)

- Bachelor's degree in Engineering, Computer Science, Statistics, Economics, Mathematics, Finance, or a related quantitative field

- An advanced CS degree is a plus.