Senior Consultant/Developer - Big Data ETL & Analytics Resume
Greenwood Village, CO
SUMMARY:
- Results-driven software development professional with extensive experience in the Data Warehouse and Business Intelligence fields. Capable of designing, implementing, and optimizing data extraction/ingestion, transformation, and load processes in the Hadoop ecosystem using Big Data processing techniques and tools.
- Proven history of building and managing large-scale ETL pipelines from inception to production.
- A team player with solid experience in both Waterfall and Agile Scrum software development environments.
SKILLS:
DB/BI ETL Tools: Azkaban, Hive, HQL, UNIX Bash/Shell scripting, Microsoft SSIS, SSAS, T-SQL, Oracle PL/SQL, Data Pump, SQL*Loader.
DB Administration: Oracle 11g, SQL Server 2012/2014 Enterprise Edition.
Reporting Tools: Microsoft SSRS.
IDE/DB Tools: Atom, DBeaver, SQL Server Management Studio (SSMS), SSDT (Data Tools), Visual Studio, DbSchema, Oracle SQL Developer, TOAD, SQL Workbench, ERwin, AWS Management Console.
DB monitoring: New Relic Analytics using SQL Server plugin, Librato Cloud Monitoring.
Source Control/Software Quality: GitLab, Jenkins, Atlassian JIRA, Microsoft TFS Source Control, UNIX RCS.
Operating Systems: Microsoft Windows Server 2012, UNIX, Mac OS.
Misc: Google Docs, Microsoft Office, Microsoft Visio, HTML, XML, HipChat, Slack.
PROFESSIONAL EXPERIENCE:
Confidential, Greenwood Village, CO
Senior Consultant/Developer - Big Data ETL & Analytics
Responsibilities:
- Develop Hadoop ETL solutions covering data ingestion, staging, loading, and aggregation for accounts and equipment data (daily ingest of 180 million records of denormalized data); a representative HQL sketch follows this list.
- Interact with business stakeholders and source systems to gather data requirements and provide analysis to leverage data value across the enterprise.
- Build ingest pipelines for other core data sets used by the data science team and analytics reporting solutions.
- Research data issues including data validation and reconciliation.
- Create MOP (Manual of Operating Procedures) for various ETL ingest flows.
- Work with the DevOps team to expand monitoring and alerting coverage.
- Contribute to the ongoing maintenance of the existing infrastructure platform and investigate issues and failures.
- Support large-scale migration projects to a new Hadoop cluster via AWS S3.
- Big Data technologies used: Hadoop on the Hortonworks platform, HDFS, Hive, HQL, Pig, Sqoop, Ambari, Azkaban workflow engine/job scheduler, Jenkins, GitLab, Linux Bash shell scripting.
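A minimal, hypothetical HiveQL sketch of the daily ingest-and-aggregate pattern described above, assuming an external staging table over landed files and a partitioned ORC target; all table, column, and path names are illustrative, not the actual client schema.

```sql
-- Hypothetical external staging table over raw denormalized files landed in HDFS (or S3).
CREATE EXTERNAL TABLE IF NOT EXISTS stg_account_equipment (
  account_id    BIGINT,
  equipment_id  BIGINT,
  status_code   STRING,
  event_ts      TIMESTAMP
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
LOCATION '/data/staging/account_equipment';

-- Curated, partitioned ORC table used for downstream analytics.
CREATE TABLE IF NOT EXISTS dw_account_equipment_daily (
  account_id       BIGINT,
  equipment_count  BIGINT
)
PARTITIONED BY (load_date STRING)
STORED AS ORC;

-- Daily aggregation load; a scheduler such as Azkaban would drive this per run date.
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT OVERWRITE TABLE dw_account_equipment_daily PARTITION (load_date)
SELECT account_id,
       COUNT(DISTINCT equipment_id) AS equipment_count,
       TO_DATE(event_ts)            AS load_date
FROM   stg_account_equipment
GROUP BY account_id, TO_DATE(event_ts);
```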
Confidential, Bellevue, WA
Senior BI / DBA Consultant
Responsibilities:
- Developed ETL packages.
- Set up and managed reporting portals and built reports.
- SSAS (Analysis Services): Designed and developed multidimensional data cubes.
- T-SQL: DDL, DML, analytic and aggregate functions, stored procedures, complex queries; MDX for cube queries (see the sketch after this list).
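A minimal, hypothetical T-SQL sketch of the kind of aggregation stored procedure used to feed reporting and cube fact tables; the FactSales/DimDate star-schema names are illustrative assumptions, not the client's actual schema.

```sql
-- Hypothetical monthly summary load; all object names are illustrative.
CREATE PROCEDURE dbo.usp_LoadMonthlySalesSummary
    @Year  INT,
    @Month INT
AS
BEGIN
    SET NOCOUNT ON;

    -- Idempotent reload of the requested month.
    DELETE FROM dbo.MonthlySalesSummary
    WHERE  [Year] = @Year AND [Month] = @Month;

    INSERT INTO dbo.MonthlySalesSummary ([Year], [Month], ProductKey, TotalSales, OrderCount)
    SELECT  d.CalendarYear,
            d.MonthNumberOfYear,
            f.ProductKey,
            SUM(f.SalesAmount),
            COUNT_BIG(*)
    FROM    dbo.FactSales AS f
    JOIN    dbo.DimDate   AS d ON d.DateKey = f.OrderDateKey
    WHERE   d.CalendarYear = @Year
      AND   d.MonthNumberOfYear = @Month
    GROUP BY d.CalendarYear, d.MonthNumberOfYear, f.ProductKey;
END;
```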
Confidential
SQL Server Database Administration (DBA)
Responsibilities:
- Designed and implemented a SQL Server relational database architecture for OLTP/transactional and replication/reporting databases on AWS cloud servers.
- Performed normalization and dimensional data modeling for an OLAP star schema with fact and dimension tables.
- Built and managed databases, tables, indexes, views, data files, backups, replication, SQL Agent jobs, and alerts (a maintenance sketch follows this list).
- Performed performance tuning and monitoring using SQL Trace, Profiler, and PerfMon.
- Designed and built AWS VPC (Virtual Private Cloud) infrastructure.
- Maintained the AWS stack (EC2, S3, and RDS instances), EBS volumes, and security groups.
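A brief, hypothetical T-SQL sketch of routine maintenance of the kind listed above (backups, index rebuilds, verification), as it might run from SQL Agent jobs with failure alerts; database, index, and path names are illustrative.

```sql
-- Nightly full backup to a dedicated backup drive (illustrative names/paths).
BACKUP DATABASE SalesDB
TO DISK = N'E:\Backups\SalesDB_full.bak'
WITH COMPRESSION, CHECKSUM, INIT;

-- Rebuild a fragmented index identified via sys.dm_db_index_physical_stats
-- (online rebuild requires Enterprise Edition).
ALTER INDEX IX_Orders_CustomerId ON dbo.Orders REBUILD WITH (ONLINE = ON);

-- Verify backup integrity.
RESTORE VERIFYONLY FROM DISK = N'E:\Backups\SalesDB_full.bak';
```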
Confidential, Greenwood Village, CO
Senior Oracle Developer
Responsibilities:
- Built ETL applications using Oracle PL/SQL.
- Designed and built custom reports using BI Publisher with XML input integration.
- Developed stored procedures and scripts that encapsulate business rules (a representative sketch follows this list).
- Tuned and optimized complex SQL queries.
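A minimal, hypothetical PL/SQL sketch of an ETL procedure that encapsulates a simple business rule, in the spirit of the bullets above; table, column, and rule names are illustrative assumptions.

```sql
CREATE OR REPLACE PROCEDURE load_active_accounts (p_batch_date IN DATE) AS
BEGIN
  -- Idempotent reload of the target slice for the batch date.
  DELETE FROM dw_active_accounts
  WHERE  batch_date = p_batch_date;

  INSERT INTO dw_active_accounts (batch_date, account_id, region_cd, balance_amt)
  SELECT p_batch_date, s.account_id, s.region_cd, s.balance_amt
  FROM   stg_accounts s
  WHERE  s.status_cd = 'A'             -- business rule: active accounts only
  AND    s.balance_amt IS NOT NULL;

  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;
END load_active_accounts;
/
```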
Confidential, Englewood, CO
Senior Analyst/Developer
Responsibilities:
- Built ETL applications using UNIX shell scripts, Oracle PL/SQL, SQL*Loader, materialized views (MVs), and external tables (a simplified load sketch follows this list).
- Worked with ~20 TB of historical data in 100 tables with multiple partitions and tablespaces.
- Developed monthly batch programs to load multiple very large data sets and perform the transformations and aggregations used by Brio/Hyperion and Oracle BI (OBIEE) reports.
- Developed and maintained batch jobs automation using AutoSys.
- Created and maintained database table structures, views, indexes, partitions.
- Conducted SQL tuning and performed various data analysis requests from business and internal clients.
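A simplified, hypothetical Oracle sketch of a monthly batch load using an external table and a partitioned target, similar to the flows described above; directory, file, table, and partition names are illustrative only.

```sql
-- External table over a monthly flat file (an alternative to a SQL*Loader control file).
CREATE TABLE stg_usage_ext (
  account_id   NUMBER,
  usage_month  VARCHAR2(10),
  usage_amt    NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY etl_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('usage_201201.dat')
)
REJECT LIMIT UNLIMITED;

-- Direct-path load into the matching partition of the historical fact table.
INSERT /*+ APPEND */ INTO fact_usage PARTITION (p_2012_01)
SELECT account_id,
       TO_DATE(usage_month, 'YYYY-MM-DD'),
       usage_amt
FROM   stg_usage_ext;
COMMIT;
```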
Confidential, Lakewood, CO
Software Development Consultant
Responsibilities:
- Developed PL/SQL scripts in Rapid Application Development/Rational Unified Process environment.
- Developed and troubleshot IBM Rational Robot and HP QuickTest Professional (VBScript) test scripts.
- Performed application testing and performance testing.