- Build data architecture and ETL processes for ingesting, cleansing, staging, and migrating large datasets from one database to another, or from multiple sources into a data warehouse.
- Write unit test scripts and perform QA for various processes/procedures.
- Write scripts and modularized stored procedures to automate tasks and application setup.
- Demonstrate strong problem-solving capabilities; approach each problem logically and analytically with a high degree of perseverance. Produce a definitive statement of the issues, identify alternative solutions, recommend a course of action, and explain the consequences of the decision.
- Anticipate future or planned events that could impact implementations, and keep the manager and project staff informed.
- Bachelor's Degree in Computer Science, Information Systems, or a related technical field, required.
- Expert knowledge of at least one ETL tool, required.
- Expert knowledge of data integration, data warehousing, data lakes, and RDBMS (at least one of Azure Database, Aurora, MySQL, SQL Server, or Oracle), required.
- Minimum of five (5) years of progressively responsible experience with SQL programming, reporting methodologies, and Business Intelligence dashboard design, development, and deployment, required.
- Familiarity with cloud technologies (Azure, AWS, etc.)
- Experience with version control (SVN, Git, etc.), release cycles, and creating documentation
- Independent, a quick learner, and familiar with agile methodologies