
Sr. Data Analyst/ETL Developer Resume


Richmond, VA

SUMMARY

  • Over 8 years of professional experience in the IT industry as a Data Analyst, Configuration Management Analyst, and ETL Developer, with strong expertise in SQL queries, stored procedures, and Teradata macros.
  • Strong Data Warehousing, Data Marts, Data Analysis, Data Organization, Metadata and Data Modeling experience on RDBMS databases.
  • Strong expertise in the banking and finance sector, primarily retail, commercial, oil and gas, secondary mortgage, and healthcare.
  • Experience in Business Analysis and Data Analysis, User Requirement Gathering, User Requirement Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.
  • Extensive knowledge of designing, developing, and implementing data marts and data structures using stored procedures, functions, data warehouse tables, views, materialized views, and indexes at the database level with PL/SQL and Oracle.
  • Extensively worked with Teradata utilities like BTEQ, Fast Export, Fast Load, Multi Load to export and load data to/from different source systems including flat files.
  • Proficient in performance analysis, monitoring and SQL query tuning using EXPLAIN PLAN, Collect Statistics, Hints and SQL Trace both in Teradata as well as Oracle.
  • Excellent hands-on experience in relational data modeling.
  • Created views for reporting purposes involving complex SQL queries with sub-queries, inline views, multi-table joins, WITH clauses, and outer joins as per functional needs (a representative SQL sketch follows this list).
  • Good Working experience in Teradata, Ab Initio, Business Objects, Crystal Reports, PL/SQL, SAS, MS Excel, MS Access.
  • Proficient in Teradata SQL coding using the BTEQ utility, with working knowledge of Teradata Parallel Transporter (TPT) and TPump coding.
  • Good hands-on experience in UNIX shell scripting.
  • Developed and maintained MS Access databases.
  • Proficient in coding optimized Teradata batch processing scripts for data transformation, aggregation, and load using BTEQ (see the BTEQ-style sketch after this list).
  • Good experience with data archival processes to SAS data sets and flat files.
  • Created datasets using SAS PROC SQL from flat, .csv, and pipe-delimited (|) files through a UNIX dev box, and wrote macros, functions, etc. in SAS 9.2 and SAS 9.3.
  • Strong experience with Base SAS, SAS/STAT, SAS/ACCESS, SAS/GRAPH, SAS/MACRO, SAS/ODS and SAS/SQL in a Windows environment.
  • Maintained ethical standards with data by ensuring database integrity as well as compliance with legislative, regulatory and accrediting requirements.
  • Experienced in performance tuning and optimization for increasing the efficiency of the scripts on large database for fast data access, conversion and delivery.
  • Experience creating design documentation related to system specifications including user interfaces, security and control, performance requirements and data conversion.
  • In-depth hands-on experience in database and ETL/ELT design and development, with excellent data analysis skills.
  • Extensively Worked in Agile delivery environments and all phases of Software Development Life Cycle (SDLC).
  • Team player able to work independently with minimal supervision; innovative, efficient, good at debugging, and with a strong desire to keep pace with the latest technologies.
  • Excellent communication and presentation skills, along with good experience communicating and working with various stakeholders.
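
A minimal sketch of the kind of reporting view described above, combining a sub-query (inline view), multi-table joins, and outer joins; all database, table, and column names (rpt_db, src_db.acct, src_db.txn, src_db.branch_ref) are illustrative, not taken from any actual engagement:

    -- Reporting view with an inline view and left outer joins (names hypothetical).
    REPLACE VIEW rpt_db.v_acct_summary AS
    SELECT a.acct_id,
           a.acct_status,
           b.branch_name,
           COALESCE(t.total_amt, 0) AS total_amt,
           COALESCE(t.txn_cnt, 0)   AS txn_cnt
    FROM   src_db.acct a
    LEFT OUTER JOIN (SELECT acct_id,
                            SUM(txn_amt) AS total_amt,
                            COUNT(*)     AS txn_cnt
                     FROM   src_db.txn
                     GROUP  BY acct_id) t
           ON a.acct_id = t.acct_id
    LEFT OUTER JOIN src_db.branch_ref b
           ON a.branch_id = b.branch_id;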
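
And a hedged sketch of the transformation/aggregation SQL a BTEQ batch script of the kind described above would run (in practice wrapped in .LOGON/.LOGOFF with .IF ERRORCODE checks); the stg_db and dw_db objects are hypothetical:

    -- Delete-and-reload of a daily aggregate, a typical BTEQ batch pattern.
    DELETE FROM dw_db.daily_sales_agg WHERE load_dt = CURRENT_DATE;

    INSERT INTO dw_db.daily_sales_agg (store_id, load_dt, net_sales, txn_cnt)
    SELECT store_id,
           CURRENT_DATE,
           SUM(sale_amt - COALESCE(refund_amt, 0)),
           COUNT(*)
    FROM   stg_db.sales_stg
    GROUP  BY store_id;

    -- Refresh optimizer statistics after the load.
    COLLECT STATISTICS ON dw_db.daily_sales_agg COLUMN (load_dt);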

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter, Data Transformation Studio, SSIS, DataStage, Ab Initio

Data Modeling: Erwin

Databases: Teradata, Oracle, MS SQL Server, MS Access, DB2

OLAP Tools: MicroStrategy OLAP Suite, Cognos, Business Objects

Languages: SQL, PL/SQL, Unix Shell Scripts

Operating Systems: Windows, Unix, Sun Solaris, Linux

Testing Tools: HP ALM and Quality Center

Domain: Banking, Finance, Healthcare

Process/Methodologies: Waterfall, Agile Methodology

PROFESSIONAL EXPERIENCE

Confidential, Richmond VA

Sr. Data Analyst/ETL Developer

Responsibilities:

  • Extensively involved in Data Extraction, Transformation and Loading (ETL process) from Source to target systems using Informatica.
  • Worked with Data Stewards and subject matter experts to research reported data anomalies, identified root causes, and determined appropriate solutions.
  • Worked with data source systems and client systems to identify data issues and data gaps, and recommended solutions.
  • Worked on creating SQL queries and performance tuning of queries
  • Worked closely with Architects and developers to deliver the database components for large scale, complex web based applications.
  • Developed and maintained stored procedures, maintained user access, and implemented changes to database design including tables and views.
  • Assisted developers and customers with ad hoc retrieval of information
  • Worked on data manipulation and analysis; accessed raw data in varied formats with different methods, analyzing and processing the data.
  • Performed data cleansing and transformation using Informatica.
  • Worked extensively with Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, and the Mapping and Mapplet Designers.
  • Acted as a liaison between IT developers and business stakeholders and was instrumental in resolving conflicts between the management and technical teams.
  • Created data mappings to extract data from different source files, transformed the data using Filter, Update Strategy, Aggregator, Expression, and Joiner transformations, and then loaded it into the data warehouse.
  • Implemented Slowly Changing Dimension Type 2 methodology for accessing the full history of accounts and transaction information.
  • Developed various mapplets that were then included into the mappings.
  • Used Workflow Manager to read data from sources, write data to target databases, and manage sessions.
  • Used the Update Strategy transformation to maintain the target dimension tables with Type 2 updates, inserting the new record and closing out the old record in the target so changes can be tracked over time (see the Type 2 SQL sketch after this list).
  • Reviewed mappings.
  • Worked with business users for requirement gathering, understanding intent and defining scope, and was responsible for project status updates to business users.
  • Performed analysis and provided summaries for business questions, initiating proactive investigations into data issues that impact reporting, business analysis or program execution.
  • Created views for reporting purposes involving complex SQL queries with sub-queries, inline views, multi-table joins, WITH clauses, and outer joins, per the functional needs in the Business Requirements Document (BRD).
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Assisted in mining data from the SQL database that was used in several significant presentations.
  • Assisted in offering support to other personnel who were required to access and analyze the SQL database.
  • Conducted one-on-one sessions with business users to gather data warehouse requirements.
  • Wrote, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.
  • Performed data analysis and data profiling using complex SQL on various source systems including Oracle and Teradata (profiling examples follow this list).
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Responsible for development of workflow analysis, requirement gathering, data governance, data management and data loading.
  • Set up data governance touch points with key teams to ensure data issues were addressed promptly.
  • Excellent knowledge of ETL tools such as Informatica and SAP BODS, making various connections to load and extract data to and from Teradata efficiently.
  • Conducted data cleaning, manipulation, modification and combination using variety of SAS steps and functions. Analyzed, tested, documented and maintained SAS programs and macros to generate SAS datasets, spreadsheets, data listing, tables and reports.
  • Responsible for generating financial business reports using SAS Business Intelligence tools (SAS/BI); also developed ad hoc reports using SAS Enterprise Guide.
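
The Type 2 work above was implemented with Informatica's Update Strategy; purely as a hedged illustration, the equivalent close-out-and-insert pattern in plain SQL, with hypothetical dim_account/stg_account tables and an acct_status tracked attribute, would look like:

    -- Close out the current row for accounts whose tracked attribute changed.
    UPDATE dim_account
    SET    eff_end_dt   = CURRENT_DATE - 1,
           current_flag = 'N'
    WHERE  current_flag = 'Y'
    AND    acct_id IN (SELECT s.acct_id
                       FROM   stg_account s,
                              dim_account d
                       WHERE  d.acct_id = s.acct_id
                       AND    d.current_flag = 'Y'
                       AND    d.acct_status <> s.acct_status);

    -- Insert the new version as the current row (new and changed accounts).
    INSERT INTO dim_account
          (acct_id, acct_status, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.acct_id, s.acct_status, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_account s
    LEFT OUTER JOIN dim_account d
      ON   d.acct_id = s.acct_id
     AND   d.current_flag = 'Y'
    WHERE  d.acct_id IS NULL
       OR  d.acct_status <> s.acct_status;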
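
Representative profiling queries of the kind used for the source-system analysis above; the src_db.account table and its columns are illustrative only:

    -- Column-level profile: row count, cardinality, nulls, and value range.
    SELECT COUNT(*)                                         AS row_cnt,
           COUNT(DISTINCT cust_id)                          AS distinct_cust,
           SUM(CASE WHEN cust_id IS NULL THEN 1 ELSE 0 END) AS null_cust,
           MIN(open_dt)                                     AS min_open_dt,
           MAX(open_dt)                                     AS max_open_dt
    FROM   src_db.account;

    -- Frequency distribution of a categorical column.
    SELECT acct_status, COUNT(*) AS cnt
    FROM   src_db.account
    GROUP  BY acct_status
    ORDER  BY cnt DESC;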

Environment: Agile, Informatica Power Center 8.6 (Designer, Workflow Manager, Monitor, Repository Manager), Teradata R12/R13, Teradata SQL Assistant, BTEQ, Fast Load, Fast Export, Multiload, TPUMP, TDWM, Erwin r7.1, Oracle, Unix Shell Scripts, SQL Server 2005/08, SAS, PROC SQL, MS Office Tools, MS Project, Windows XP, MS Access, Pivot Tables

Confidential, Irving Tx

ETL Developer/Sr. Data Analyst

Responsibilities:

  • Worked on Informatica Designer Tool’s components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer and Mapplet Designer.
  • Involved in the development of Informatica mappings and tuned them for better performance; the mappings used most of the transformations, including Source Qualifier, Expression, Filter, Aggregator, Lookup, Update Strategy, Joiner, Sequence Generator, Sorter, Rank and Router.
  • Used unconnected Lookups in combination with Update Strategy to implement a Type 2 warehouse.
  • Lookup tables were implemented at the staging level for the faster response time of Lookups.
  • Extensively used Informatica's Workflow Manager and Workflow Monitor tools to load data from MS SQL Server and Oracle OLTP sources into the target Oracle 9i data warehouse.
  • Applied Event Wait and Event Raise tasks along with Email tasks for error notification.
  • Used Data Transformation Studio to import and customize prebuilt transformations for industry standards such as EDI, SWIFT, and HIPAA.
  • Leveraged its document preprocessors to extract data locked in PDF files and Microsoft Word and Excel documents, and generated a flexible transformation framework from message specifications authored in Word or Excel.
  • Involved in the complete software development lifecycle (SDLC), from business analysis through development, testing, deployment and documentation.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Performed daily validation of business data reports by querying databases and rerunning missing business events before the close of the business day.
  • Wrote Teradata macros and used various Teradata analytic functions (a macro sketch follows this list).
  • Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant and BTEQ.
  • Worked on Teradata stored procedures and functions to validate the data and load it into tables.
  • Developed procedures to populate the customer data warehouse with transaction data, cycle and monthly summary data, and historical data (see the aggregation sketch after this list).
  • Worked on optimizing and tuning Teradata views and SQL to improve batch performance and response time of data for users.
  • Used Model Mart of ERwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Developed logical data models and physical data models using ER-Studio.
  • Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information.
  • Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment. Developed working documents to support findings and assign specific tasks.
  • Provided initial capacity and growth forecast in terms of Space, CPU for the applications by gathering the details of volumes expected from Business.
  • Prepared the low-level technical design document and participated in the build/review of BTEQ, FastExport, MultiLoad and FastLoad scripts; reviewed unit test plans and system test cases.
  • Involved in the complete software development life cycle (SDLC) including requirements gathering, analysis, design, development, testing, implementation and deployment.
  • Worked on claims data and extracted data from various sources such as flat files, Oracle and Mainframes.
  • Gathered Business requirements by interacting with the business users, defined subject areas for analytical data requirements.
  • Optimized complex queries for data retrieval from huge databases.
  • Performed root cause analysis of data discrepancies between different business systems by examining business rules and the data model, and provided the analysis to the development/bug-fix team.
  • Led the data correction and validation process, using data utilities to fix mismatches between different shared business operating systems.
  • Excellent knowledge of ETL tools such as Informatica and SAP BODS, making various connections to load and extract data to and from Teradata efficiently.
  • Conducted downstream analysis of the tables involved in data discrepancies and arrived at solutions to resolve them.
  • Performed extensive data mining of the attributes involved in business tables, providing consolidated analysis reports and resolutions on an ongoing basis.
  • Performed data analysis and data profiling using complex SQL
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Verified data quality after every deployment and performed extensive analysis of data variance pre- and post-implementation.
  • Worked on data warehouses with sizes from 30 to 50 terabytes.
  • Utilized data to prepare and conduct quality control with SAS.
  • Conducted data cleaning, manipulation, modification and combination using variety of SAS steps.
  • Developed and maintained MS Access database.
  • Created an MS Access database to collect data systems in order to provide lists, status reports and management overview reports.
  • Analyzed critical data elements (CDEs) in business processes, data conversion, data movement and data integrity checks before delivering data to operations and financial analysts for uploading to databases, in accordance with IM policy compliance.
  • Extensively involved in User Acceptance Testing (UAT) and Regression testing.
  • Involved in Data Reconciliation Process while testing loaded data with user reports.
  • Documented all custom and system modifications.
  • Worked with offshore and other environment teams to support their activities.
  • Responsible for deployment on test environments and supporting business users during User Acceptance testing (UAT).
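
A hedged sketch of a Teradata macro using an ordered analytical function, of the kind referenced in the macro bullet above; the database, table, column, and parameter names are all hypothetical:

    -- Macro returning the top 10 transactions per branch for a given date.
    REPLACE MACRO dw_db.top_txns (run_dt DATE) AS (
        SELECT branch_id,
               acct_id,
               txn_amt
        FROM   dw_db.daily_txn
        WHERE  txn_dt = :run_dt
        QUALIFY RANK() OVER (PARTITION BY branch_id
                             ORDER BY txn_amt DESC) <= 10;
    );

    -- Invocation:
    EXEC dw_db.top_txns (DATE '2012-06-30');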
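
And a sketch of the monthly-summary population step such warehouse procedures performed; the tables, columns, and month shown are illustrative assumptions, not the actual objects:

    -- Aggregate one month of transaction detail into the customer summary.
    INSERT INTO dw_db.cust_monthly_summary
          (cust_id, summary_month, txn_cnt, total_amt)
    SELECT cust_id,
           DATE '2012-06-01',
           COUNT(*),
           SUM(txn_amt)
    FROM   dw_db.txn_detail
    WHERE  txn_dt BETWEEN DATE '2012-06-01' AND DATE '2012-06-30'
    GROUP  BY cust_id;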

Environment: DataStage 8.1, Oracle 10g, DB2, Sybase, Teradata SQL Assistant, Erwin r7.1, Informatica Power Center 7.1, SQL, TOAD, MLOAD, TPUMP, FAST LOAD, FAST EXPORT, TDWM, PMON, DBQL, Cognos 8.0, SQL Server 2008, TSYS Mainframe, SAS PROC SQL, SQL, PL/SQL, ALM/Quality Center 11, QTP 10, UNIX, Shell Scripting, XML, XSLT.

Confidential, San Antonio TX

ETL Developer

Responsibilities:

  • Worked as an ETL Mapping developer.
  • Extensively used various transformations like Aggregator, Lookup, Expression, Router and Filter Transformations.
  • Developed various worklets that were then included into the workflows.
  • Tuned and tested mappings using different logic to provide maximum efficiency and better performance.
  • Reviewed mappings.
  • Identified problematic areas and conducted research to determine the best course of action to correct the data.
  • Analyzed problem and solved issues with current and planned systems as they relate to the integration and management of order data.
  • Analyzed reports of data duplicates or other errors to provide ongoing appropriate inter-departmental communication and monthly or daily data reports.
  • Monitored for timely and accurate completion of select data elements.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Identified, analyzed, and interpreted trends and patterns in complex data sets.
  • Monitored data dictionary statistics.
  • Developed logical and physical data models that capture current state/future state data elements and data flows using Erwin.
  • Extensively used Erwin for forward and reverse engineering, following corporate naming-convention standards and using conformed dimensions whenever possible.
  • Involved in data mapping and data clean up.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Planned project activities for the team based on project timelines using Work Breakdown Structure.
  • Created Technical Design Documents, Unit Test Cases.
  • Involved in Test case/data preparation, execution and verification of the test results
  • Reviewed PL/SQL migration scripts.
  • Coded PL/SQL packages to perform Application Security and batch job scheduling.
  • Created user guidance documentations.
  • Created a reconciliation report for validating migrated data (sketched below).
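
A minimal sketch of the reconciliation checks behind that report, assuming hypothetical legacy_db source and dw_db target tables:

    -- Row-count comparison between source and migrated target.
    SELECT 'source' AS side, COUNT(*) AS row_cnt FROM legacy_db.orders
    UNION ALL
    SELECT 'target' AS side, COUNT(*) AS row_cnt FROM dw_db.orders;

    -- Rows present in the source but missing or altered in the target.
    SELECT order_id, order_amt FROM legacy_db.orders
    MINUS
    SELECT order_id, order_amt FROM dw_db.orders;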

Environment: Informatica Power Center (Designer, Workflow Manager, Monitor, Repository Manager), UNIX, Shell Scripting, XML Files, XSD, XML Spy 2010, SAS PROC SQL, Cognos 8.0, Oracle 10g, Teradata, Sybase, Mercury Quality Center 10, QTP 10, Toad, Autosys.

Confidential, Buffalo NY

Data Analyst/ Teradata Developer

Responsibilities:

  • Provided Data Analysis, gathered Business requirements and Designed and developed various Business Reports
  • Developed various BTEQ scripts to create business logic and process the data.
  • Generated weekly, bi-weekly and monthly reports with the help of Oracle, Teradata, SQL, BTEQ, MS Access, MS Excel and SAS.
  • Worked on loading data from several flat-file sources to staging using Teradata MultiLoad, FastLoad and TPump.
  • Performed query optimization using explain plans, collected statistics, and primary and secondary indexes (see the tuning sketch after this list).
  • Assisted IT developers and non-IT developers (database analysts) with Teradata utilities.
  • Extracted large volumes of mainframe data, converting it into SAS data sets and then into Oracle tables from remote UNIX and Windows environments by writing Base SAS programs.
  • Converted flat files such as text and Excel files into SAS data sets by writing DATA step programs in Base SAS.
  • Monitored existing code performance and changed the code for better performance.
  • Performed data validation and data integrity checks before delivering data to operations and financial analysts.
  • Experience in database monitoring and data stewardship.
  • Identified root-cause problems and provided solutions.
  • Developed SQL queries for Extracting data from production database and built data structures, reports.
  • Designed and developed ad-hoc reports per business analyst, operations analyst, and project manager data requests.
  • Performed various data pulls from the Teradata One View data warehouse using SQL Assistant and BTEQ.
  • Used SQL Assistant to query database objects and validate data quality.
  • Designed and developed ad hoc reports by extracting data from the Oracle PCARD data warehouse using SQL*Plus.
  • Used FastExport to generate flat files after change data capture was accomplished, which in turn created loadable files used to load the database.
  • Analyzed data by pulling it from the data warehouse into MS Excel and building graphs.
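
A sketch of that tuning workflow in Teradata SQL, inspecting the plan, refreshing statistics, and adding a secondary index for a frequent predicate; the dw_db.daily_txn table and its columns are illustrative assumptions:

    -- Inspect the optimizer's plan for an expensive report query.
    EXPLAIN
    SELECT acct_id, SUM(txn_amt)
    FROM   dw_db.daily_txn
    WHERE  txn_dt = DATE '2010-03-31'
    GROUP  BY acct_id;

    -- Refresh statistics on the filter column.
    COLLECT STATISTICS ON dw_db.daily_txn COLUMN (txn_dt);

    -- Add a secondary index to support the date predicate.
    CREATE INDEX txn_dt_idx (txn_dt) ON dw_db.daily_txn;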

Environment: Teradata, SQL Developer, NCR Teradata utilities (BTEQ, FastLoad, Fast Export) 6.01, Oracle, SQL, PL/SQL, DB2, UNIX, Windows 2000/NT, SAS 9.1.3, SQL Assistant, SAS PROC SQL

Confidential, Memphis TN

Business/Data Analyst/ Teradata Developer

Responsibilities:

  • Built forms for the application using various form modules and windows.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data.
  • Performed daily validation of business data reports by querying databases and rerunning missing business events before the close of the business day (see the validation sketch after this list).
  • Wrote Teradata Macros and used various Teradata analytic functions.
  • Good knowledge of Teradata Manager, TDWM, PMON, DBQL, SQL Assistant and BTEQ.
  • Worked on Teradata stored procedures and functions to validate the data and load it into tables.
  • Participated in gathering Business Requirements and designed the project according to the Business requirement specifications.
  • Wrote Perl scripts and UNIX shell scripts for various data comparison and data parsing needs.
  • Conducted Bug Review meetings for update on defects from development team and retesting of bug fix. Worked with developers to fix faults found in the structure and functionality of the application.
  • Prepared daily/weekly bug status reports highlighting bug fix metrics and tracked the progress of test cycles in Quality Center
  • Conducted Training & Knowledge Transfer Sessions on new applications to QA Analysts
  • Expertise in code reviews. Expertise in security issues of client/server technologies.
  • Designed the screens and layout using Visual Basic 5.0.
  • Used SQL*Plus for updating the database and for data retrieval.
  • Performed unit testing, manual testing of the processes to check the integrity.
  • Involved in the implementation, testing and maintenance of the system.
  • Ran batch jobs for loading database tables from flat files using SQL*Loader.
  • Developed UNIX Shell scripts to automate repetitive database processes
  • Used Clear Quest for defect tracking and reporting, updating the bug status and discussed with developers to resolve the bugs.
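
A hedged sketch of the kind of daily validation query used to catch missing business events before close of business; the control and log tables (ctl_db.expected_events, stg_db.event_log) are hypothetical:

    -- Compare today's loaded events against expected counts per event type.
    SELECT e.event_type,
           e.expected_cnt,
           COUNT(l.event_id) AS loaded_cnt
    FROM   ctl_db.expected_events e
    LEFT OUTER JOIN stg_db.event_log l
      ON   l.event_type = e.event_type
     AND   l.event_dt   = CURRENT_DATE
    GROUP  BY e.event_type, e.expected_cnt
    HAVING COUNT(l.event_id) < e.expected_cnt;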

Environment: Oracle, SQL*Plus, SQL, Test Director, TOAD, Rational Clear Quest, Clear Case, SQL Server 2000, Teradata SQL Assistant, PL/SQL, Visual Basic 6.0, XML, XSLT, Java, UNIX, Shell scripting.
