What You’ll Do:
- Develop processes and data models for consuming large quantities of third-party vendor data via RESTful APIs.
- Develop data processing pipelines to analyze, transform, and migrate data between applications and systems.
- Analyze data from multiple sources and reconcile differences in storage schemas through the ETL process.
- Develop APIs for external consumption by partners and customers.
- Develop and support our ETL environment by recommending improvements, monitoring, and deploying quality and validation processes to ensure accuracy and integrity of data.
- Design, develop, test, deploy, maintain, and improve data integration pipelines.
- Create technical solutions that solve business problems and are well engineered, operable, and maintainable.
- Design and implement tools to detect data anomalies (observability). Ensure that data is accurate, complete, and high quality across all platforms.
- Troubleshoot data issues and perform root cause analysis to proactively resolve product and operational issues.
- Assemble large and complex data sets; develop data models from structured data sets based on specifications.
- Develop familiarity with emerging and complex automations and technologies that support business processes.
- Develop scalable and reusable frameworks for ingesting and transforming large datasets.
- Work within an Agile delivery / DevOps methodology to deliver product increments in iterative sprints.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to support their data infrastructure needs and assist with data-related technical issues.
- Develop data models and mappings and build new data assets required by users. Perform exploratory data analysis on existing products and datasets.
- Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Engage in logical and physical design of databases, table creation, script creation, views, procedures, packages, and other database objects.
- Create documentation for solutions and processes implemented or updated to ensure team members and stakeholders can correctly interpret it.
- Design and implement processes and/or process improvements to help the development of technology solutions.
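To give a concrete flavor of the ingestion and anomaly-detection duties above, here is a minimal sketch of the validate/transform stage of a vendor-data pipeline in NodeJS. The field names (`id`, `price`, `updatedAt`) are hypothetical examples, not a real vendor schema, and a production pipeline would add logging, schema versioning, and persistence.

```javascript
// Minimal sketch of the transform/validate stage of an ingestion pipeline.
// Field names (id, price, updatedAt) are hypothetical, not a real vendor schema.

// Validate and normalize one raw vendor record; return null on quality failure.
function normalizeRecord(raw) {
  if (raw == null || typeof raw.id !== "string" || raw.id.length === 0) return null;
  const price = Number(raw.price);
  if (!Number.isFinite(price) || price < 0) return null; // reject anomalous values
  const updatedAt = new Date(raw.updatedAt);
  if (Number.isNaN(updatedAt.getTime())) return null;    // reject unparseable dates
  return { id: raw.id, price, updatedAt: updatedAt.toISOString() };
}

// Transform a batch, separating clean rows from rejects so rejects can be
// counted and surfaced for observability.
function transformBatch(rawRecords) {
  const clean = [];
  const rejected = [];
  for (const raw of rawRecords) {
    const rec = normalizeRecord(raw);
    if (rec) clean.push(rec);
    else rejected.push(raw);
  }
  return { clean, rejected };
}
```

The split into `clean` and `rejected` outputs keeps data-quality monitoring cheap: the reject count per batch becomes a metric that can trigger an anomaly alert.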
Qualifications & Interests:
- 5+ years of relational database development experience, including SQL query writing and tuning, database design, and data concepts.
- 5+ years of backend and RESTful API development experience in NodeJS (experience with GraphQL a plus).
- 5+ years of development experience with the following languages: Python, Java, and C#/.NET.
- 5+ years of experience with SQL and NoSQL databases, including MS SQL and MongoDB.
- 5+ years of experience consuming RESTful APIs for data ingestion and storage.
- 5+ years of experience developing RESTful APIs for use by customers and third parties.
- 3+ years of professional work experience designing and implementing data pipelines in a cloud environment.
- 3+ years of experience working within Azure cloud.
- Experience in integrating and ingesting data from external data sources.
- Strong diagnostic skills and ability to research, troubleshoot, and logically determine solutions.
- Ability to effectively prioritize tasks in a fast-paced, high-volume, and evolving work environment.
- Comfortable managing multiple, changing priorities while meeting deadlines.
- Highly organized and detail-oriented, with excellent time management skills.
- Excellent written and verbal communication skills.