ETL Analyst Resume
SUMMARY:
- Fifteen years of experience building data warehouses in a Data Services Specialist role.
- Many years of experience with Pervasive, Informatica, the SSIS/SSAS/SSRS stack (TFS/Visual Studio), and ProClarity.
- Advanced knowledge of KPIs, SOA architecture, Agile development, and tabular models.
- Strong relational database development skills using T-SQL on SQL Server and PL/SQL on Oracle 9.x/11g. Familiar with Teradata concepts and able to come up to speed within days.
- Extensive project experience in the telecommunications, finance, and warranty industries.
- Exposure to data modeling using ERwin, especially star and snowflake schemas (see the sketch after this list).
- Involved in all development stages, including requirements analysis, design, development, and testing, on UNIX/Windows platforms.
- Studied Hadoop documentation; familiar with HDFS, MapReduce, Sqoop, Hive, and related tools.
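For illustration, a minimal star-schema sketch in T-SQL; all table and column names are hypothetical, not taken from any project below:

```sql
-- Dimensions carry descriptive attributes; the fact table holds measures
-- plus foreign keys to each dimension.
CREATE TABLE DimDate (
    DateKey      INT      NOT NULL PRIMARY KEY,  -- e.g. 20240131
    CalendarDate DATE     NOT NULL,
    FiscalYear   SMALLINT NOT NULL
);

CREATE TABLE DimProduct (
    ProductKey  INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    ProductCode VARCHAR(20)  NOT NULL,
    ProductName VARCHAR(100) NOT NULL
);

CREATE TABLE FactClaim (
    ClaimKey    BIGINT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    DateKey     INT NOT NULL REFERENCES DimDate (DateKey),
    ProductKey  INT NOT NULL REFERENCES DimProduct (ProductKey),
    ClaimAmount DECIMAL(18,2) NOT NULL
);
```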
EMPLOYMENT HISTORY:
Confidential
ETL Analyst
Responsibilities:
- Designed and implemented SQL code that holds the business logic and loads heterogeneous sources into the AVEKSA and IDP project systems.
- Designed ETL workflows and Netezza scripts to load various source files based on BA requirement specifications.
- Cooperated with the BA to design and upgrade the reporting data model using PowerDesigner.
- Applied various SSIS framework loading patterns to handle multi-file loads (CSV, XLS, and fixed-width files).
- Exported selected tables to flat files based on specific business requirements.
- Coordinated with the BA, QA, and BI teams: updated the design, supported QA testing, and worked with BI to deploy the packages to the production server.
- Configured the framework tables to define landing zones, source file templates, and the recipient addresses for process status notifications.
- Wrote efficient SQL queries with CTE techniques to handle complicated business logic (see the sketch after this list).
- Designed OLAP SSAS cubes with a star schema using multiple dimensions, perspectives, hierarchies, and measure groups.
- Created Ad-Hoc Reports, Summary Reports, Sub Reports, and Drill-down Reports using SSRS.
- Delivered daily, weekly, and monthly load schedules to the Application Analyst for execution in Autosys.
- Troubleshot package processing issues and fixed them quickly to keep regular loads running.
- Implemented the designs in Visual Studio using SSIS, Netezza SQL scripts, and C# in a TFS environment.
- Supported production runs, performed troubleshooting, and built reports in Tableau.
- Studied documentation on Big Data architectures: Hadoop, MapReduce, Hive, Pig, etc.
Technical Environment: Agile method, PowerDesigner, SQL Server 2012, T-SQL, SSIS ETL tools (TFS/Visual Studio/C#), Netezza database, Tableau, Excel
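A minimal sketch of the CTE pattern referenced above, assuming a hypothetical staging table dbo.StgTransactions(AccountId, Amount, LoadDate); shown in T-SQL, though the project also used Netezza SQL:

```sql
-- Layered business logic via CTEs: isolate the latest record per account,
-- compute per-account totals, then join the two intermediate results.
WITH LatestPerAccount AS (
    SELECT AccountId,
           Amount,
           LoadDate,
           ROW_NUMBER() OVER (PARTITION BY AccountId
                              ORDER BY LoadDate DESC) AS rn
    FROM dbo.StgTransactions
),
AccountTotals AS (
    SELECT AccountId, SUM(Amount) AS TotalAmount
    FROM dbo.StgTransactions
    GROUP BY AccountId
)
SELECT l.AccountId, l.Amount AS LatestAmount, t.TotalAmount
FROM LatestPerAccount AS l
JOIN AccountTotals AS t
  ON t.AccountId = l.AccountId
WHERE l.rn = 1;
```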
Confidential
Database Solution Analyst
Responsibilities:
- Built the Payment Dashboard loading process.
- Designed a high-performance, simply structured ETL process based on the system requirement specification, complying with Confidential's ETL design standards.
- Coordinated with the BA and updated the design to meet the client's functional and non-functional requirements.
- Updated the original Cognos query code and made it T-SQL compliant. Loaded and generated the fact, summary, and dimension tables in the SQL Server staging area.
- Configured the load files, invoked the special mechanism, and loaded the data into DB2.
- Designed the load status, monitoring, and error reporting system, which emails issues to the relevant employees in real time; set up automated batch execution with CA7.
- Implemented the design on the Visual Studio and TFS platform using SSIS, T-SQL scripts, and C#.
- Migrated a complex business Cognos query from Oracle 11g to SQL Server.
- Analyzed the meaning and function of each query and replaced the statements not supported by SQL Server (see the sketch after this list).
- Performance-tuned the queries.
Technical Environment: SQL Server 2005, T-SQL, SSIS ETL tools (TFS/Visual Studio/C#), TOAD 4.5, Oracle 11g, CA7 job scheduling system, Excel
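A hypothetical before/after sketch of the kind of statement replacement this migration involved; the table, columns, and logic are illustrative only:

```sql
-- Oracle-style expressions that SQL Server does not support:
--   SELECT NVL(p.amount, 0),
--          DECODE(p.status, 'A', 'Active', 'Inactive'),
--          SYSDATE
--   FROM payments p;

-- T-SQL equivalents:
SELECT ISNULL(p.amount, 0)   AS amount,
       CASE p.status
            WHEN 'A' THEN 'Active'
            ELSE 'Inactive'
       END                   AS status_desc,
       GETDATE()             AS load_time
FROM dbo.payments AS p;
```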
Confidential, Beaverton, OR
IT Analyst
Responsibilities:
- Coordinated with the customer project manager and gathered the client's functional and non-functional requirements.
- Designed a high-performance, simply structured Data Junction (DJ) process covering Domestic Wire, International Wire, and ACH.
- Configured the document types for the project in Confidential tools and synchronized the settings with the DJ parameters.
- Coordinated with the PM to complete unit and integration testing.
- Analyzed the existing DJs in the Confidential DJ code repository and the input source files.
- Coordinated with the PM, extracted the DJ code for the multi-XML-file merge function, and built it into the core-function DJ.
- Set various parameters and configuration files for CRS's standard core ETL runtime environment.
- Changed parameters in DJ to make them suitable for the Airgas DJ process.
- Adjusted parts of the code to be dynamic and flexible in both the implementation and production environments for the Airgas DJ process.
- Coordinated with the PM to complete unit and integration testing, configured the document types in Confidential tools, and deployed the code to production.
- Confidential System daily operations support and maintenance
- Performed database administration on production servers, including server configuration, performance tuning, and maintenance, with strong troubleshooting skills.
- Applied a strong technical background to business analysis and to writing effective documentation and specifications.
- Wrote T-SQL stored procedures, triggers, and functions.
- Performed data transformation services and bulk loads with SSIS and BULK INSERT operations (see the sketch after this list).
- Designed staging tables and wrote efficient, high-quality queries to handle complicated business logic.
- Monitored and supported Confidential daily operations; troubleshot and resolved issues promptly.
- Wrote efficient SQL code and performed performance tuning.
- Coordinated with the PM and programming team to add advanced functions for different clients based on their requirements.
- Built tables in Excel and grouped columns into sums that could be compared against the corresponding database values to verify results.
- Analyzed the large, complicated DJ process and performed troubleshooting, deployment, and backup tasks.
- Migrated the whole system from Data Junction to the Microsoft platform, including designing SSIS process packages based on the existing logic held in DJ and implementing them with T-SQL and C# code.
- Performed ad-hoc backups for clients as a baseline or for testing.
- Tested backups to ensure they could be used to restore a database.
- Restored backups from one server to another or copied databases from one server to another.
- Implemented, monitored and tested SQL Server database backup and recovery procedures.
- Developed and stored all code and packages in Visual Studio and TFS.
- Used advanced Excel and Access to find duplicate rows and other bad data.
- Manipulated data with advanced Excel formulas to compare results across databases and verify them.
- Supported production runs and performed troubleshooting.
- Migrated ETL system from Data Junction to SSIS
- Worked through the project SDLC: identified the environment, reviewed the requirements document, and discussed and decided on SSIS package structure solutions.
- Wrote the ETL migration design documents, confirmed the system file naming conventions and testing strategy, and carried out development and QA testing using the agile method.
Technical Environment: SQL Server 2012, T-SQL, SSIS, SSRS, SharePoint, Pervasive Data Junction ETL tools (TFS/Visual Studio/C#), Confidential System, Excel, MS Access
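A minimal BULK INSERT sketch for the bulk-load work mentioned above; the file path, staging table, and file format are hypothetical:

```sql
-- Load a pipe-delimited flat file into a staging table before transformation.
BULK INSERT dbo.StgWirePayments
FROM 'D:\landing\wire_payments.txt'  -- illustrative path
WITH (
    FIELDTERMINATOR = '|',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2,  -- skip the header row
    TABLOCK           -- allows a minimally logged load into a heap staging table
);
```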
Confidential, Clifton Park, NY
Database System Analyst
Responsibilities:
- Designed and implemented a special map that extends the Pervasive map capability for handling large data volumes.
- Split a large file into multiple files.
- Designed a special dynamic data structure that handles information for multiple flat files.
- Designed a special map that automatically picks up multiple flat files as sources, based on the data information, and generates multiple XML files as targets.
- Orange County Court TORE Data Uploading and Maintenance
- Worked effectively with the client and other team members; proactively found critical problems in the design solution in the client's Requirement Design Specification and persuaded the client to accept our solution, which simplified the design, cut development time, and reduced post-delivery support, across the full SDLC under the agile method.
- Created the Software Design Specification for the New Business, AREvent, and Tags components using agile methodology.
- Designed a process staging table that holds all the records that need to be written back to the client's Oracle database, reducing round trips to the client's system (see the sketch after this list).
- Performed installation of new SQL Server instances to build the development environment.
- Improved on the traditional design by creating a special map that handles data from the database source and generates the XML file directly. The traditional solution loads the data into multiple flat files, merges them into one flat file, and generates the XML file by invoking a standard core map; the upgraded method improves performance, removes two steps, and saves maintenance time.
- Set various parameters and configuration files for CRS's standard core ETL runtime environment.
- Redesigned the process workflow to run all data loading in one process, making the whole ETL system easy to deploy and maintain.
- Designed the error reporting system and discussed the details of the returned content with the client.
- Reviewed the technical code, designed the test plan, developed the test scenarios and scripts, and proactively addressed gaps and identified potential pitfalls.
- Acted as a technical tooling and data expert, and mentored less experienced colleagues on data-related systems integration problems and special Pervasive use cases.
- Supported the client through integration testing, responded quickly to the client's emails, and troubleshot problems together with the client.
- Shared systematic thinking, design concepts, and advanced tool usage scenarios to improve whole-team efficiency and save costs.
- Supported production runs and performed troubleshooting.
- Health Recovery Revenue Group Data Uploading into Titanium System ETL project
- Wrote the Software Design Specification based on the Software Requirement Specification.
Technical Environment: Oracle 11g, SQL Server 2008, Pervasive Data Integrator ETL tools, Titanium System, Visual SourceSafe, Excel
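A sketch of the write-back staging pattern described above, with hypothetical table and column names; it is shown in T-SQL for consistency, although the actual write-back target was the client's Oracle database:

```sql
-- Accumulate outbound records in one staging table, then write them back
-- in a single set-based batch instead of row-by-row calls.
CREATE TABLE dbo.TargetCases (
    CaseNumber VARCHAR(20) NOT NULL PRIMARY KEY,
    Status     VARCHAR(10) NOT NULL,
    UpdatedAt  DATETIME    NULL
);

CREATE TABLE dbo.StgWriteBack (
    RecordId    INT         NOT NULL PRIMARY KEY,
    CaseNumber  VARCHAR(20) NOT NULL,
    NewStatus   VARCHAR(10) NOT NULL,
    ProcessedAt DATETIME    NOT NULL DEFAULT GETDATE()
);

-- One set-based update replaces many individual round trips.
UPDATE t
SET    t.Status    = s.NewStatus,
       t.UpdatedAt = s.ProcessedAt
FROM   dbo.TargetCases  AS t
JOIN   dbo.StgWriteBack AS s
  ON   s.CaseNumber = t.CaseNumber;
```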
Confidential, Fairfax, VA
Data Warehouse Specialist
Responsibilities:
- Implemented new data marts for warranty claim analysis, dispatch, and survey.
- Worked through the whole SDLC, including writing design documents and handling the implementation details of building a data warehouse: requirements analysis, dimensional data modeling, ETL processes, cube building, and report generation.
- Built ETL packages using SSIS tasks and T-SQL scripts in Visual Studio and TFS for data analysis/profiling, data cleansing, data conversion, and metadata management.
- Performed extensive business requirements analysis, captured business requirements and wrote detailed Technical Design documents.
- Developed and maintained cubes, Business Objects universes, data structures, and various utilities on both the Oracle and SQL Server sides to support the data mart load and the ongoing needs of business owners.
- Wrote high-performance MDX code for calculated measures to meet clients' special requirements.
- Reviewed technical code, tested scenarios and scripts, and proactively identified potential pitfalls.
- Managed data storage, collection, and structure; periodically backed up the data warehouse; and ran weekly incremental loads in a database administrator role (see the sketch after this list).
- Worked effectively with the Business Analysts and other team members; proactively found problems and suggested implementation methods to improve process efficiency.
- Added new features, enhanced existing functions in the cubes, and advised the technology architects on database topics supporting ongoing development efforts.
- Generated the company fiscal calendar in Excel with a VBA module.
- Created new views and MDX queries in ProClarity Professional based on the client's requirements.
- Wrote expressions in SSRS and fine-tuned the reports.
- Performed T-SQL tuning and handled urgent, critical firefighting tasks to meet deadlines.
- Worked in a DBA role, evaluating the whole data mart loading and publishing process and the costs of its parts.
- Participated in creating and recommending standards and policies, and complied with them to meet growing business needs and maintain a high level of user service satisfaction.
- Shared systematic thinking, design concepts, and advanced tool usage scenarios to improve whole-team efficiency and save costs.
- Coordinated and cooperated with team members and provided help when needed.
Technical Environment: Agile method, Oracle 9i/10g, TOAD, SQL Server 2005, SSAS, SSRS (TFS/Visual Studio/C#), ProClarity 6.1, ERwin, JIRA, Visual SourceSafe, Excel, SharePoint, MDX, DMX, XMLA, VBA/ASP.NET
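A hypothetical sketch of the weekly incremental-load pattern mentioned above, using a watermark table to pull only rows changed since the last successful load; all table and column names are illustrative:

```sql
-- Read the watermark left by the previous successful load.
DECLARE @LastLoad DATETIME;
SELECT @LastLoad = LastLoadDate
FROM dbo.EtlWatermark
WHERE TableName = 'FactClaim';

-- Insert only the rows modified since the previous load.
INSERT INTO dbo.FactClaim (DateKey, ProductKey, ClaimAmount)
SELECT s.DateKey, s.ProductKey, s.ClaimAmount
FROM dbo.StgClaim AS s
WHERE s.ModifiedDate > @LastLoad;

-- Advance the watermark after a successful load.
UPDATE dbo.EtlWatermark
SET    LastLoadDate = GETDATE()
WHERE  TableName = 'FactClaim';
```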
Confidential
ETL Informatica Engineer
Responsibilities:
- Analyzed the functional specs provided by the data architect, reviewed and updated the data models, and created technical spec documents for all the mappings.
- Reviewed source systems, proposed data acquisition strategy, and designed ETL process.
- Used ERwin extensively for forward and reverse engineering, following corporate naming-convention standards and using conformed dimensions whenever possible.
- Designed and developed Informatica Mappings to load data from Source systems to ODS and Data Mart.
- Extensively used PowerCenter to design multiple mappings with embedded business logic.
- Designed and implemented highly reusable mapplets to encapsulate business rules, reducing repeated transformations in ETL mappings and improving maintainability.
- Performance-tuned long-running sessions, reducing the average load time from 24 hours to less than 9 hours while maintaining accuracy.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
- Performed performance tuning using round-robin, hash auto-key, and key-range partitioning.
- Optimized legacy ETL mappings by removing unnecessary complexity, greatly improving maintenance and troubleshooting. Freed approximately 120 GB of disk space by dropping logically redundant tables.
- Created various UNIX shell scripts to schedule data cleansing scripts and loading processes, and maintained the batch processes with them.
- Migrated Portfolio Data Mart from SQL Server to Oracle
- Migrated the data mart using Oracle SQL Developer.
- Updated and tested the stored procedure, function, and view query scripts on Oracle.
- Tested and verified the results in the data mart and confirmed they matched the original (see the sketch after this list).
Technical Environment: Oracle 9i, Oracle SQL Developer, SQL Server, ERwin 4, Informatica PowerCenter 7.1.3, Unix Korn shell
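A sketch of one way to verify that a migrated data mart matches the original, comparing the two in both directions; the table and column names are hypothetical, and it is shown in T-SQL with EXCEPT (the Oracle side would use MINUS):

```sql
-- Rows in the original that are missing or different in the migrated copy:
SELECT PortfolioId, PositionDate, MarketValue
FROM dbo.PortfolioFact_Original
EXCEPT
SELECT PortfolioId, PositionDate, MarketValue
FROM dbo.PortfolioFact_Migrated;

-- And the reverse direction; both queries should return zero rows.
SELECT PortfolioId, PositionDate, MarketValue
FROM dbo.PortfolioFact_Migrated
EXCEPT
SELECT PortfolioId, PositionDate, MarketValue
FROM dbo.PortfolioFact_Original;
```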