
Senior ETL Consultant Resume


OBJECTIVE:

Seeking a Senior Data Warehouse ETL Developer/Python Developer/SQL Developer role to apply 18 years of experience in SSIS, Informatica, Ab Initio, SQL, Python and Hadoop.

SUMMARY:

  • 15 years of experience in ETL tools such as SSIS, Ab Initio, Informatica, DataStage and Talend, and in BI tools such as SSAS/SSRS, COGNOS (PowerCube, Transformer, Tabular Reports) and BusinessObjects.
  • 12 years of Data Warehousing experience in ETL, data architecture and modeling, including star, snowflake and multidimensional schema design, as well as Hadoop development.
  • 15 years of data modeling, data analysis and development experience in SQL, T-SQL, PL/SQL, HiveQL and MDX, including 10 years of Oracle, 7 years of DB2, 7 years of MS SQL Server and 5 years of Sybase experience.
  • 10 years of intensive coding experience in Unix scripting, C/C++/C#, Java and Perl, as well as 5 years of mainframe experience in TSO/JCL/COBOL/SAS.
  • In-depth knowledge of Agile development, Python, Machine Learning, Big Data/Hadoop, Cloud, Tableau, distributed processing, Jenkins, CI/CD processes, AML, Web Services and SDLC/PMLC.
  • Positive and responsible attitude, fast learner, and willingness to work hard in a fast-paced, multi-tasking and collaborative team environment.

TECHNICAL SKILLS:

D/W Technologies: D/W Lifecycle, Normalization, Denormalization, Hierarchy, Star schema, Snowflake schema, Dimension, Multi-dimension, Data Modeling, SCD, COGNOS Impromptu, IWR, PowerPlay, Transformer, Catalog, Ad-hoc reports, PowerCube, MDX queries

Unix Technologies: GNU, Unix Libc, Unix kernel, Crontab, Berkeley Socket C, POSIX threads, Solaris threads, TCP Client/Server, IPC, Message Queue.

Database: ORACLE 8/9i/10g/11g/12c, ORACLE Application Server, DB2, Informix, Microsoft SQL Server 2008/2008 R2/2012/2014/2016

Database Tools: ERwin, Coolgen, ORACLE Developer, Designer, ORACLE SQL Loader, Materialized View, Data Dictionary, ORACLE Reports, Toad

Languages: Unix C, Perl, Oracle Pro*C, PL/SQL, B/C/Korn Shell Script, COBOL, JCL, ESQL/C, SAS, J2EE, Python, Scala, Stored Procedures, Kafka, ASP.NET, .NET

Platforms: Solaris, HP UNIX, AIX, Linux, Mainframe/OS/390, WIN NT, XP, Lotus Notes, Hadoop Distributed File System (HDFS), WINDOWS

Networks: TCP/IP, FTP, MQ-Series, SNMP, SNA, VPN, Bloomberg

Development Tools: Ab Initio, InfoSphere DataStage, Informatica, Microsoft DTS/DQS/MDS, Power BI, VBScript, QualityStage, Talend, Hadoop, SourceSafe, UltraEdit, MS Project and Office, MS Visio, Mercury TestDirector, Lotus Notes, AQT, ESP, Autosys, SharePoint, Beyond Compare, Notepad++, Web API, WinSCP, Eclipse, Spark, Aginity 4.3, COGNOS Impromptu, SAP, SQL Server Profiler, Kafka, JSON, Cloudera, JMS, wiki, Automation Test

PROFESSIONAL EXPERIENCE:

Confidential

Senior ETL Consultant

Environment: Oracle, SQL Server, SSIS/SSAS/SSRS, Python 3/2, Anaconda, scikit-learn, Machine Learning, C#, Tabular, DevOps, Jenkins, UrbanCode Deploy, DAX, Tableau, Tibco, GEMS, Jira, Agile

Wrote SSIS packages and C# to extract, transform and load data from EDW to the CSAD data mart using Execute SQL tasks, script tasks, script components, joins, derived columns, lookups, etc.

Responsibilities:

  • Write C# in SSIS script components and script tasks to perform key cutting and to expire historical trade transactions.
  • Use Jenkins and UrbanCode to deploy code to SIT/UAT/PROD and automate development and testing.
  • Use Tibco GEMS to create and drop messages in queues to trigger Python and SSIS jobs.
  • Create complex SQL queries to track 7 bitemporal slowly changing dimensions and update the dimension keys in fact tables.
  • Use Python multiprocessing and multithreading to call the vendor product OneTick Cloud API and retrieve market prices for equities, futures and options for data analysis (see the sketch after this list).
  • Use Python to implement complex business logic that computes slippage, evaluating the trade efficiency of traders and brokers based on timing and price differences.
  • Use SSAS to create OLAP cubes, Tabular models and DAX queries, and use SSRS/Tableau to create reports/dashboards for data discovery.
  • Use execution plans to tune the performance of T-SQL in SQL Server and PL/SQL in Oracle, identify the bottlenecks in slow queries, and create indexes and partitioned tables to improve performance.
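
A minimal sketch of the multithreaded market-data pull described above. The endpoint URL, API key, symbols and response shape are placeholders, not the real OneTick Cloud API:

    import concurrent.futures
    import requests  # third-party HTTP client

    # Hypothetical endpoint and credentials, for illustration only.
    BASE_URL = "https://onetick-cloud.example.com/prices"
    API_KEY = "REPLACE_ME"

    def fetch_price(symbol):
        """Fetch the latest market price for one instrument (equity, future or option)."""
        resp = requests.get(BASE_URL, params={"symbol": symbol, "apikey": API_KEY}, timeout=30)
        resp.raise_for_status()
        return symbol, resp.json()

    symbols = ["AAPL", "ESZ4", "IBM"]

    # A thread pool overlaps the I/O-bound HTTP calls instead of running them serially;
    # a ProcessPoolExecutor would be the multiprocessing counterpart.
    with concurrent.futures.ThreadPoolExecutor(max_workers=8) as pool:
        for symbol, payload in pool.map(fetch_price, symbols):
            print(symbol, payload)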

Confidential

Senior Data Warehouse ETL Consultant

Environment: SQL Server 2012, T-SQL, SSIS/SSAS/SSRS, Python 3, Pandas, NumPy, IBM Netezza 7.0, Ab Initio, Microsoft Visual Studio 2010/2012, TFS, C#/C++, DevOps, Jira, SharePoint, Hortonworks HDP, Bitbucket, GitHub, Java 8, Core Java, Web Service, Hive, HQL, HBase, NoSQL, Spark, Scala, Kafka.

Responsibilities:

  • Gathered business requirements, defined and designed the data sourcing and data flows, performed data quality analysis, and worked with the data architect on the development of logical and physical data models.
  • Identified and fixed a hidden critical issue in the Central Banking System (MECH), which had been taking in transactions from only one region instead of six, recovering 6 million transactions.
  • Extensively designed and created mappings using SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate, SQL Task, Script Task and Send Mail Task.
  • Used C# to preprocess (read, verify, split and merge) source flat files, EBCDIC files and XML files, then used SSIS to load them into IDP for use by downstream systems and web services.
  • Developed complex SQL scripts to transform data using inner and outer joins, subqueries and WITH (common table expression) queries.
  • Wrote PowerShell scripts using environment variables and parameters to dynamically deploy SSIS packages to DEV/SIT/UAT/PROD.
  • Wrote Python code to read from Oracle, SQL Server and CSV files, then used Pandas DataFrames to group and aggregate financial data for analysis and reporting (see the sketch after this list).
  • Created a base class and subclasses in Python to read different lines of business, transforming and calculating the data according to the business requirements of each.
  • Moved source data into Hadoop HDFS and wrote HiveQL to load from HDFS into Hive Level 1/2/3 and HBase; converted Netezza SQL into HiveQL and migrated the Data Warehouse into Hive.
  • Designed and developed SSAS databases and OLAP cubes, and created Matrix, Chart, List, Subreport and Tabular reports in SSRS for financial reporting and data visualization.
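
A minimal sketch of the Pandas group-and-aggregate step referenced above; the file name and the line_of_business, trade_date and amount columns are invented placeholders:

    import pandas as pd

    # Placeholder input file and column names, for illustration only.
    df = pd.read_csv("transactions.csv", parse_dates=["trade_date"])

    # Group financial data by line of business and month, then aggregate.
    summary = (
        df.groupby(["line_of_business", df["trade_date"].dt.to_period("M")])["amount"]
          .agg(["count", "sum", "mean"])
          .reset_index()
    )
    print(summary.head())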

Confidential

Senior ETL Developer

Environment: Ab Initio 3.1.5, EME, AIX 7.1, Oracle 11G, KSH scripting, Jira, blueprint, SOAP, SAS 9.4, PL/SQL, Java, C/C++/C#

Responsibilities:

  • Completed the client linkage project, which identifies newborn babies sharing health cards with their mothers, with zero UAT defects and deployed it to PROD within one month, 30% ahead of schedule.
  • Took a key ETL design and development role in the client linkage and HSMR projects, developing Ab Initio graphs with transformations such as lookup, rollup, filter, scan and normalize.
  • Developed an Ab Initio graph using the SOAP component to call SAS code that calculates performance trending values, passing 100 organizations per record to cut run time by 95%.
  • Composed complex queries using UNION, subqueries and outer joins to extract data from source systems, and used execution plans to optimize queries and improve performance.
  • Proactively clarified requirements with the BA, identifying and fixing 3 potential requirement issues before they could surface as defects in QA.
  • Attended daily issue review meetings with the technical lead, QA, users, data modeler and project lead, and developed code and supported QA/UAT/stress testing in an Agile environment.
  • Wrote Unix wrappers, setting parameters as needed, to run Ab Initio jobs via air commands (see the sketch below).
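
The wrappers above are Unix shell scripts; purely as a rough Python analogue, the sketch below shells out to a deployed Ab Initio graph script with a run-date parameter. The script path, parameter name and date are placeholders:

    import os
    import subprocess
    import sys

    # Placeholder path to a deployed Ab Initio graph's run script.
    GRAPH_SCRIPT = "/apps/abinitio/deploy/run/client_linkage.ksh"

    def run_graph(run_date):
        """Run the deployed graph with RUN_DATE set, failing loudly on a nonzero return code."""
        env = {**os.environ, "RUN_DATE": run_date}  # parameter passed via the environment
        result = subprocess.run([GRAPH_SCRIPT], env=env, capture_output=True, text=True)
        if result.returncode != 0:
            sys.stderr.write(result.stderr)
            raise RuntimeError(f"graph failed with rc={result.returncode}")
        return result.stdout

    if __name__ == "__main__":
        print(run_graph("2016-06-30"))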

Confidential

Senior Data Warehouse ETL Consultant

Environment: SSIS/SSAS/SSRS, Ab Initio 3.1.7/3.0.3, EME, Solaris, Sybase IQ, DBVisualizer 8.0.2, C/C++/C#, Java, QTP, LoadRunner, Autosys 11, SAS Enterprise Guide 4.3, ITIL, MicroStrategy, ASP.NET, SQL, mainframe.

Responsibilities:

  • Took an ETL design and development role in the MACH2 and CAD projects, which convert BMO credit card consumer data into TSYS and create a central data repository.
  • Took over the coaching tool migration on short notice and completed the development in 1 month, after the project had been stalled for 4 months due to a lack of requirements communication.
  • Worked on ETL transformations for projects such as the Equifax Credit Risk project and MECH mortgage transactions, and created financial and regulatory reports.
  • Designed SSIS packages to transfer data from flat files and mainframe EBCDIC files to the Sybase IQ data warehouse using Business Intelligence Development Studio.
  • Used package configurations and created deployment scripts and environment variables so the same SSIS package could run in Dev/Testing/Prod environments.
  • Extensively used SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate, Conditional Split, SQL Task, Script Task and Send Mail Task.
  • Used Ab Initio to develop transformations including join, rollup, scan and normalize to fulfill business rules, and created EME tags to promote code to QA/Production.
  • Used KSH/C/C++ to run Unix wrappers and air commands, as well as for data cleansing and data standardization (see the sketch below).
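
A small Python sketch of the kind of data cleansing and standardization described in the last bullet; the file layout and the rules shown (trimming, upper-casing codes, ISO dates) are illustrative assumptions:

    import csv
    from datetime import datetime

    def standardize(row):
        """Trim whitespace, upper-case the branch code and normalize dates to ISO format."""
        row["branch_code"] = row["branch_code"].strip().upper()
        row["open_date"] = datetime.strptime(row["open_date"].strip(), "%m/%d/%Y").strftime("%Y-%m-%d")
        return row

    with open("accounts.txt", newline="") as src, open("accounts_clean.txt", "w", newline="") as dst:
        reader = csv.DictReader(src, delimiter="|")
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames, delimiter="|")
        writer.writeheader()
        for row in reader:
            writer.writerow(standardize(row))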

Confidential

Senior Data Warehouse Developer

Environment: SSIS/SSAS/SSRS, Python, Pandas, SciPy, Ab Initio 3.0/2.15, EME, Talend, SQL Server, Sybase, HDP Hadoop, Hive, Java 7, Linux, Hermes, IssueView, Unix Script, ITIL, .NET, C/C++/C#, BCP.

Responsibilities:

  • Organized a monthly meeting among the US, UK, China, Canada and India teams, and performed the data architecture, modeling and design of the new DW OLAP database.
  • Reviewed QA test plans, provided QA support, delivered code to PROD weekly in an Agile environment, and resolved any production issues within half an hour.
  • Developed SSIS packages using data flow tasks, script tasks and lookup transformations to extract, transform and load data into the data warehouse.
  • Used Python to connect to Oracle and SQL Server and extract capital markets trade data, including stocks, futures and bonds, for data analysis and reporting (see the sketch after this list).
  • Created SSIS packages for file transfer from one location to another using the FTP task.
  • Used Ab Initio and stored procedures to convert Prime Finance and Trade OLTP data into the OLAP data warehouse, and used SSAS/SSRS to create cubes and reports to monitor market risk.
  • Extensively wrote Talend ETL logic using tMap, tAggregate, tNormalize, tJavaRow, tHiveRow and custom Java code, loading into the OLAP and Hadoop Hive databases.
  • Tuned SQL and stored procedures using explain plans, identified bottlenecks, created proper indexes and used an efficient archival strategy to improve performance.
  • Led production releases and checkouts, and provided 24/7 production support in rotation.
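
A minimal sketch of the Python extraction step referenced above, using pyodbc against SQL Server; the connection string, table and column names are placeholders (an Oracle pull would swap in an Oracle driver such as cx_Oracle):

    import pandas as pd
    import pyodbc  # third-party ODBC wrapper

    # Placeholder connection details; the real server, database and credentials differ.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=tradesrv;DATABASE=capmkt;Trusted_Connection=yes;"
    )

    # Pull trade data (stocks, futures, bonds) for downstream analysis and reporting.
    query = """
        SELECT trade_id, instrument_type, symbol, quantity, price, trade_date
        FROM dbo.trades
        WHERE trade_date >= ?
    """
    trades = pd.read_sql(query, conn, params=["2014-01-01"])
    print(trades.groupby("instrument_type")["quantity"].sum())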

Confidential

Senior Data Warehouse ETL Developer

Environment: Ab Initio, BRE, InfoSphere Information Server, Cognos, DataStage, QualityStage, SSIS/SSAS/SSRS, IBM Rational IDE, AIX, DB2, Oracle, MQ, IssueView, SOAP, NDM, Mainframe.

Responsibilities:

  • Reviewed the functional spec and mapping spreadsheet with BA/BI/QA, provided valuable feedback and recommendations, and drew up the technical design based on the requirements.
  • Led the GR resources from Europe and India, monitored progress, and reviewed the code and test results to deliver to PROD on time and with good quality.
  • Used Ab Initio to write ETL transformation logic based on mapping rules, and implemented continuous flows using MQ Subscribe/Publish and continuous components.
  • Replaced the Join with DB component with an unload and took advantage of the Partition with Load Balance component, reducing the total time to process 12 billion records from 12 hours to 3.
  • Wrote Korn shell scripts and C/C++, using SED and AWK to pre-format the source data and perform data cleansing (see the sketch below).
  • Used Prism to create COBOL ETL code and uploaded it to the mainframe; once the COBOL programs were compiled and run through ISPW, the output data files were pushed to AIX via NDM jobs.
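
A Python stand-in for the SED/AWK pre-formatting mentioned above; the substitution and field rules are illustrative only:

    import re
    import sys

    # SED-style: strip control characters; AWK-style: split on a delimiter and re-emit fields.
    CTRL = re.compile(r"[\x00-\x1f]")

    for line in sys.stdin:
        line = CTRL.sub("", line.rstrip("\n"))        # sed-style substitution
        fields = re.split(r"\|+", line)               # awk-style field splitting, collapsing repeats
        print("|".join(fields[:5]))                   # keep only the first five fields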

Confidential

Data Conversion Consultant

Environment: DB2, SQL Server, Erwin, DataStage, Mainframe, GladStone, COBOL, SAS, Focus.

Responsibilities:

  • Designed and created logical and physical data models from business requirements using Erwin, and generated DDL via forward engineering.
  • Used GladStone to generate COBOL code, and submitted JCL jobs on the mainframe to unload data from DB2 and download it to PC via Universal Command.
  • Transformed and converted data from third-party Unum Insurance into IASP (policy) and FINEOS (claim) data using DataStage, and scheduled the jobs in Director.
  • Extensively wrote SAS programs for data analysis, file merges and report generation to fulfill user requirements (see the sketch after this list).
  • Worked collaboratively and effectively with US colleagues, providing guidance and reviews that contributed to high-quality code delivery.
  • Proactively worked with the BSA to receive and review the conversion requirements, raising questions to ensure the requirements were fully understood and suggesting improvements where necessary.
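
The merge-and-report work above was done in SAS; purely as an illustrative pandas analogue (the file and column names are invented), the equivalent step looks like:

    import pandas as pd

    # Invented file and column names, for illustration only.
    policies = pd.read_csv("iasp_policies.csv")
    claims = pd.read_csv("fineos_claims.csv")

    # Left join keeps policies that have no claims, roughly mirroring a SAS MERGE step.
    merged = policies.merge(claims, on="policy_id", how="left")

    # Summary report: claim counts and totals per policy status.
    report = merged.groupby("policy_status").agg(
        claim_count=("claim_id", "count"),
        total_paid=("paid_amount", "sum"),
    )
    print(report)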

Confidential

Team Leader

Environment: Solaris, AIX, ORACLE, CoolGen, Erwin, B/C/Korn Script, C/C++, Cron jobs, Perl 5, Informatica, Cognos Impromptu/PowerCube, Autosys, J2EE, WebSphere, mainframe COBOL.

Responsibilities:

  • Led ESP (European Strategy Platform) and Switzerland/Singapore mutual fund EDW (Enterprise Data Warehouse) projects, coordinated with end user and BA to develop functional and design specifications, drew out the project plan and schedules.
  • Thanks to deliveries with good quality and high customer satisfaction, won a star award in April 2005 out of 305 candidates, and grew the team from 4 to 11 members.
  • Modeled data using CoolGen/Erwin and produced the logical and physical models, building up the star schema and snowflake schema using normalization and denormalization methodologies.
  • Successfully implemented the ETL process with Informatica, Perl, C shell, C++ and PL/SQL scripts, and used Autosys and Crontab to schedule job flow dependencies.
  • Designed and built ORACLE Materialized Views (snapshots) to extract data from the central EDW to the Mutual Fund Data Mart, reducing ETL effort by 70% (see the sketch after this list).
  • Used CoolGen to generate COBOL codes to implement complex business logic.
  • Used Cognos Transformer and PowerPlay to build Multidimensional Cubes, alternate drill down and Cube Groups, created the Crosstab, Pie graph and Bar charts etc.
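
A sketch of the materialized-view approach above, issued here through Python and the cx_Oracle driver; the connection details, object names and the COMPLETE refresh choice are assumptions (FAST refresh would additionally require materialized view logs):

    import cx_Oracle  # Oracle driver, now also published as python-oracledb

    # Placeholder credentials and DSN.
    conn = cx_Oracle.connect("dm_user", "password", "edwhost/EDW")
    cur = conn.cursor()

    # One-time DDL: snapshot EDW fund positions into the data mart.
    cur.execute("""
        CREATE MATERIALIZED VIEW mf_positions_mv
        BUILD IMMEDIATE
        REFRESH COMPLETE ON DEMAND
        AS SELECT fund_id, account_id, position_date, market_value
           FROM edw.fund_positions
    """)

    # Periodic refresh (e.g. scheduled via Autosys or cron) replaces a hand-built ETL pass.
    cur.callproc("DBMS_MVIEW.REFRESH", ["MF_POSITIONS_MV", "C"])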

Confidential

Data Warehouse Designer/Developer

Environment: AIX, ORACLE DBMS (8/9i), Ascential DataStage, Mainframe Z/OS, PL/SQL, Stored Procedures, SQL*Loader, Erwin, CTRL-M, CTRL-D, C/C++, Crontab, Cognos, TCP/IP, TSO, JCL.

Responsibilities:

  • Analyzed OLTP system data, defined business requirements, and produced Data Warehouse architectural specifications built on star schema and dimensional modeling.
  • Designed and successfully implemented the ETL process using DataStage to build the PCDW Data Warehouse as well as marketing and customer relationship data marts.
  • Captured data changes from Oracle, DB2 and SQL Server tables and replicated the CDC to the data warehouse staging area (see the sketch after this list).
  • Wrote Unix Korn shell scripts to perform various batch jobs, including file system backups, Oracle stored procedure execution, flat file formatting and data cleansing.
  • Developed ETL programs using PL/SQL functions, stored procedures, triggers and DataStage transformers to execute the initial and incremental data loads.
  • Managed the Cognos Impromptu catalog and created Crosstab, List, Summary, Ad Hoc and Sub-query reports for business users.
  • Constructed COBOL code to read data from and write into the IDMS database.
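
A simplified sketch of an incremental load like the CDC replication above; this uses a timestamp-watermark pattern rather than log-based CDC, and the connection details, control table and column names are placeholders:

    import cx_Oracle  # Oracle driver, now also published as python-oracledb

    conn = cx_Oracle.connect("etl_user", "password", "oltphost/OLTP")
    cur = conn.cursor()

    # 1. Read the high-water mark left by the previous run.
    cur.execute("SELECT last_loaded_ts FROM stg.load_control WHERE table_name = 'ORDERS'")
    (last_ts,) = cur.fetchone()

    # 2. Pull only the rows changed since then (the initial load starts from a very old watermark).
    cur.execute(
        "SELECT order_id, status, updated_ts FROM app.orders WHERE updated_ts > :ts",
        ts=last_ts,
    )
    changed = cur.fetchall()

    # 3. Stage the delta and advance the watermark in the same transaction.
    if changed:
        cur.executemany("INSERT INTO stg.orders_delta VALUES (:1, :2, :3)", changed)
        cur.execute(
            "UPDATE stg.load_control SET last_loaded_ts = :ts WHERE table_name = 'ORDERS'",
            ts=max(row[2] for row in changed),
        )
    conn.commit()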
