Data Analyst/ETL Developer Resume

Peoria, IL

SUMMARY

  • Over 8 years of IT experience with Business and Data Analysis, ETL Development, Data Modeling, Data Profiling, Data Migration, Data Integration and Metadata Management Services.
  • More than 6 years of Data Warehousing experience using Informatica PowerCenter 9.x/8.x/7.x/6.x/5.x, PowerConnect, PowerExchange, Warehouse Designer, PowerAnalyzer, ETL, Data Marts, OLAP, OLTP, and Star and Snowflake schemas.
  • Worked across various domains, including Banking, Finance, Insurance, Health Care, and Manufacturing.
  • Strong experience gathering business requirements from business users, performing source-to-target mapping, creating source-to-target mapping documents, and creating process flows and data flow diagrams (DFDs).
  • Excellent understanding of Data Warehousing Concepts - Star and Snowflake schema, SCD Type1/Type2, Normalization/De-Normalization, Dimension & Fact tables.
  • Proficient as a Data Analyst in gathering data from different sources, performing data profiling and data definition, and loading the data into the business warehouse.
  • Experience conducting Joint Application Development (JAD) sessions for requirements gathering, analysis, and design, and Rapid Application Development (RAD) sessions to converge early on a design acceptable to the customer and feasible for the developers, limiting the project’s exposure to the forces of change.
  • Strong experience in Data Analysis, Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export using multiple ETL tools such as Informatica PowerCenter and DataStage.
  • Developed project management plans, identified key milestones, tracked metrics, and managed stakeholders throughout the lifecycle of multiple projects to ensure adherence to project schedules and budgets.
  • Worked with heterogeneous relational databases such as Teradata, Oracle, MS Access and SQL Server.
  • Experience with other utilities/tools/scripts such as Korn shell scripting, SQL*Plus, SQL*Loader, Export and Import utilities, TOAD 9.1, and Visio 10.0.
  • Experience using the IDQ tool for profiling source and target data, applying business rules, and developing mappings to move data from source to target systems.
  • Technical expertise in ETL methodologies and Informatica 6.x/7.x/8.x/9.x: PowerCenter, PowerMart, client tools (Mapping Designer, Workflow Manager/Monitor), server tools (Informatica Server Manager, Repository Server Manager), and PowerExchange.
  • Experience with relational databases such as Oracle 10g/9i/8i, DB2 ESE, MS SQL Server 2005/2008, DTS, Sybase, Teradata V2R5, and MS Access, and with formats such as flat files, CSV files, COBOL files, and XML files.
  • Designed project plans and teams to deliver on-time, on-budget, and on-scope results within risk tolerances and without undue project management overhead.
  • Expertise in a broad range of technologies, including business process tools such as Microsoft Project, MS Excel, MS Access, and MS Visio, technical assessment tools, MicroStrategy, data warehousing, and data modeling and design.
  • Worked on Big Data Ingestion, Reporting using Spark, Scala, Java, Pig, Hive, Sqoop, Oozie, Python, Kafka and UNIX.
  • Experience in complete Software Development Life Cycle (SDLC), Software Testing Life Cycle (STLC) and Bug Life Cycle (BLC).
  • Created standards for collection and management of metadata.
  • Granted and revoked table access for Database Analysts, Business Analysts, Operational Analysts, and Project Managers.
  • Experience in testing and writing SQL and PL/SQL statements - Stored Procedures, Functions, Triggers and packages.
  • Excellent experience in data mining, querying and mining large datasets to discover transition patterns and examine financial data.
  • Created various dashboards using Tableau, Excel, and Access with a focus on user interface and simple data consumption
  • Experience in Business Intelligence tools like Business Objects, Cognos, and OBIEE.
  • Involved in creating Dashboards, reports as needed using Tableau Desktop and Tableau Server.
  • Handled all domain and technical interaction with application users, analyzed client business processes, and documented business requirements.
  • Well versed in Data Migration, Data Conversions, and Data Extraction/Transformation/Loading (ETL) using DTS and PL/SQL scripts.
  • Expertise in source to target mapping in Enterprise and Corporate Data Warehouse environments.
  • Implemented Slowly Changing Dimension Type 1, Type 2, and Type 3 methodologies for accessing the full history of account and transaction information (a minimal SQL sketch of the Type 2 pattern follows this list).
  • Experienced as a Data Analyst performing complex Data Profiling, Data definition, and Data Mining, validating and analyzing data, and presenting reports.
  • Ability to handle and interact with Business users, collect and analyze business / functional requirements.
  • Excellent communication, documentation and presentation skills with clear understanding of business process flow.
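For reference, a minimal sketch of the SCD Type 2 pattern mentioned above, in Oracle-style SQL. The dim_account/src_account tables, their columns, and the sequence are hypothetical placeholders, not objects from any of the projects below.

    -- Step 1: expire the current dimension row when a tracked attribute changes
    -- (dim_account, src_account and all column names are placeholders).
    UPDATE dim_account d
       SET d.eff_end_date = SYSDATE,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM src_account s
                    WHERE s.account_id = d.account_id
                      AND s.account_status <> d.account_status);

    -- Step 2: insert a fresh current version for changed and brand-new accounts.
    INSERT INTO dim_account
           (account_key, account_id, account_status,
            eff_start_date, eff_end_date, current_flag)
    SELECT dim_account_seq.NEXTVAL, s.account_id, s.account_status,
           SYSDATE, DATE '9999-12-31', 'Y'
      FROM src_account s
     WHERE NOT EXISTS (SELECT 1 FROM dim_account d
                        WHERE d.account_id = s.account_id
                          AND d.current_flag = 'Y');

Because step 1 expires the old version first, step 2's NOT EXISTS picks up both changed accounts and accounts seen for the first time, preserving full history.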

TECHNICAL SKILLS

Databases: Oracle 11g/10g/9i/8i, SQL Server 2005/2008, IBM DB2, Teradata, MS Access

Data Warehousing: Informatica PowerCenter 9.x/8.x/7.x (Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor, and Informatica Server), Data Marts, fact and dimension tables, physical and logical data modeling

Data Modeling Tools: ERWIN, ER Studio, MS Visio 2003, Power Designer

Database Tools: SQL*PLUS, SQL*Loader, Teradata 5.0, SQL Developer, TOAD 9.5

Reporting Tools: OBIEE, Crystal Reports 8.0/9.0/10, Business Objects 6.5/XI, Cognos

Languages: C, C++, Pro*C, COBOL, SQL, PL/SQL, T-SQL, UNIX Shell Scripting, PERL, Python

Environment: Linux, HP-UX, Windows 2000/NT, Windows XP/Vista/7

PROFESSIONAL EXPERIENCE

Confidential, Peoria, IL

Data Analyst/ETL Developer

Responsibilities:

  • Worked with business requirements analysts/subject matter experts to identify and understand requirements. Conducted user interviews and data analysis review meetings.
  • Defined key facts and dimensions necessary to support the business requirements along with Data Modeler.
  • Created draft data models to aid understanding and to help the Data Modeler.
  • Resolved data-related issues such as assessing data quality, consolidating data, and evaluating existing data sources.
  • Manipulated, cleansed, and processed data using Excel, Access, and SQL.
  • Performed data validations using SQL Developer.
  • Responsible for loading, extracting, and validating client data.
  • Contributed to project management, requirements analysis, the software development lifecycle, full-lifecycle documentation, and database work.
  • Worked closely with the Data Architect to review all conceptual, logical, and physical database design models with respect to function, definition, and maintenance, and supported the data analysis, data quality, and ETL design that feed the logical data models.
  • Worked with clients to understand their data migration needs and determine any data gaps.
  • Designed project plans and teams to deliver on-time, on-budget, and on-scope results within risk tolerances and without undue project management overhead.
  • Developed several complex IDQ mappings using a variety of Informatica PowerCenter transformations.
  • Analyzed the source data coming from various data sources like Mainframe & Oracle.
  • Created data mapping documents mapping logical data elements to physical data elements and source data elements to destination data elements.
  • Excellent understanding of Hadoop distributed system architecture and design principles.
  • Worked on Big Data Ingestion, Reporting using Spark, Scala, Java, Pig, Hive, Sqoop, Oozie, Python, Kafka and UNIX.
  • Created and analyzed all phases of the project management process.
  • Audited files and metadata entries for completeness and proper organization.
  • Created metadata reports using MS Excel.
  • Applied business rules and profiled source and target data using IDQ.
  • Created new processes that shortened migration time from four weeks to two and increased the reusability of ETL templates.
  • Extracted and managed metadata.
  • Well versed in Data Migration, Data Conversions, and Data Extraction/Transformation/Loading (ETL) using DTS and PL/SQL scripts.
  • Managed, updated, and manipulated report orientation and structure using advanced Excel functions, including pivot tables and VLOOKUPs.
  • Worked with various IDQ transformations, including Standardizer, Match, Association, Parser, Weighted Average, Comparison, Consolidation, Decision, and Expression.
  • Tested the data using the logs generated after loading into the data warehouse.
  • Prepared a Traceability Matrix mapping requirements to test cases.
  • Worked on Master Data Management (MDM) to maintain customer information and define the ETL rules to be applied.
  • Worked on different layers of the Business Intelligence infrastructure.
  • Worked extensively in Data consolidation and harmonization.
  • Worked extensively in the T-SQL environment to run queries and explore the databases.
  • Worked with OLAP, ETL, data warehousing, and modeling tools.
  • Used Informatica / SSIS to extract, transform & load data from SQL Server to Oracle databases.
  • Met with user groups to analyze requirements and propose changes in design and specifications.
  • Performed Detailed Data Analysis (DDA), Data Quality Analysis (DQA), and Data Profiling on source data (a sample profiling query follows this list).
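To illustrate the kind of column-level profiling done here, a hedged sketch in SQL; src_customer and its email column are placeholder names, not actual project objects.

    -- Profile one column: row count, null count, distinct values, null percentage
    -- (src_customer/email are placeholder names).
    SELECT COUNT(*)                      AS total_rows,
           COUNT(email)                  AS non_null_emails,
           COUNT(*) - COUNT(email)       AS null_emails,
           COUNT(DISTINCT email)         AS distinct_emails,
           ROUND(100 * (COUNT(*) - COUNT(email)) / COUNT(*), 2) AS pct_null
      FROM src_customer;

Running a query like this per candidate column (or letting IDQ's profiler do the equivalent) surfaces completeness and cardinality issues before mapping rules are written.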

Environment: Informatica 9.x, Oracle 11g, SQL Server 2008, SSIS 2008, SPSS, PL/SQL, Erwin, TOAD, MS Excel, MS Access, Visio, Python, Linux

Confidential, San Francisco, CA

ETL Data Analyst

Responsibilities:

  • Participated in user meetings, gathered Business requirements & specifications for the Data-warehouse design. Translated the user inputs into ETL design docs.
  • Defined and documented the technical architecture of the Data Warehouse, including the physical components and their functionality.
  • Created the star-schema dimensional model for the data mart using Visio and created dimension and fact tables based on the business requirements.
  • Analyzed and gathered user requirements and created the necessary data migration documentation.
  • Designed the ETL architecture to process a large number of files and created high-level and low-level design documents.
  • Developed project management plans, identified key milestones, tracked metrics, and managed stakeholders throughout the lifecycle of multiple projects to ensure adherence to project schedules and budgets.
  • Used analytic skills to detect debit and credit card fraud in real time.
  • Responsible for implementing the Informatica/Teradata CDC logic to process the delta data.
  • Designed reference data and data quality rules using IDQ and was involved in cleansing data in the Informatica Data Quality 9.1 environment.
  • Verified information with customers and directed them to their next steps.
  • Worked extensively on flat files and mainframe files; created UNIX shell scripts for FTP, for generating list files to load multiple files, and for archiving files after loads completed.
  • Worked alongside clients to develop strategies for migrating their business data across platforms using Microsoft SQL Server.
  • Strong experience with Hadoop file system and Hive commands for data mappings.
  • Worked with OLAP, ETL, data warehousing, and modeling tools.
  • Created new strategies for preventing fraud.
  • Used Informatica to extract, transform & load data from SQL Server to Oracle databases.
  • Involved in creating Informatica mappings to extract data from Oracle and flat files and load it into the staging area.
  • Strong expertise in using Informatica PowerCenter 8.x/9.6.1 (Designer, Workflow Manager, Repository Manager), Informatica Cloud, Informatica MDM 10.1, Informatica Data Quality (IDQ) 9.6.1, and ETL concepts.
  • Worked on data mapping, data cleansing, program development for loads, and verification of converted data against legacy data.
  • Assisted with documentation creation for a new fraud system to be implemented.
  • Worked on Master Data Management (MDM) to maintain customer information and define the ETL rules to be applied.
  • Skilled at implementing project management processes.
  • Created Excel reports for metadata extracts using pivot tables.
  • Developed SQL Server views for generating metadata reports.
  • Building, publishing customized interactive reports and dashboards, report scheduling using Tableau server.
  • Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
  • Created FastExport, MultiLoad, and FastLoad UNIX script files for batch processing.
  • Designed and created data cleansing, data conversion, validation, and external loading scripts (MLOAD, FLOAD) for the Teradata warehouse using the Informatica ETL tool.
  • Involved in error handling, performance tuning of mappings, testing of Stored Procedures and Functions, Testing of Informatica Sessions, and the Target Data.
  • Transformed the existing ETL logic to Hadoop mappings.
  • Experienced in loading data into Data Warehouse/Data Marts using Informatica and the Teradata MultiLoad, FastLoad, and BTEQ utilities.
  • Worked extensively on Teradata indexes, creating the proper Primary Index (PI) considering both planned access of data and even distribution of data across all available AMPs.
  • Worked with the DBA on enhancements to the physical DB schema and coordinated on creating and managing tables, indexes, tablespaces, triggers, DB links, and privileges.
  • Possess expertise with relational database concepts, stored procedures, functions, triggers and scalability analysis. Optimized the SQL and PL/SQL queries by using different tuning techniques like using hints, parallel processing and optimization rules.
  • Responsible for SQL tuning and optimization using Analyze, Explain Plan and optimizer hints.
  • Maintained up-to-date table statistics, verified access paths using explain plans, and monitored skew factor and query resource usage using query log data.
  • Supported application development timelines by implementing designs as well as incremental changes to database definition in a timely manner into production and non-production Teradata systems.
  • Maintain and Tune Teradata Production and Development systems.
  • Involved in migrating mappings from IDQ to PowerCenter.
  • Perform tuning and optimization of database configuration and application SQL.
  • Loaded data using the Teradata loader connection, wrote Teradata utility scripts (FastLoad, MultiLoad), and worked with loader logs; scheduled workflows, BTEQ scripts, and UNIX shell scripts using the WLM scheduling tool.
  • Supported Business Readiness team on planning and deployment of Collibra tasks.
  • Performed Detailed Data Analysis (DDA), Data Quality Analysis (DQA) and Data Profiling on source data.
  • Responsible for migrations of the code from Development environment to QA and QA to Production.
  • Documented the existing mappings as per the design standards followed in the project.
  • Prepared validation scripts to compare the new data with legacy systems (a sample comparison query follows this list).
  • Carried out defect analysis and fixed bugs raised by users.
  • Highly motivated with the ability to work effectively in teams as well as independently
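As a hedged illustration of the legacy-versus-new validation scripts mentioned above (legacy_txn and new_txn are placeholder tables, not actual project objects), in SQL that runs on both Oracle and Teradata:

    -- Rows in the legacy table that are missing or different in the new table
    -- (run the reverse MINUS as well for a symmetric check).
    SELECT txn_id, txn_amount, txn_date FROM legacy_txn
    MINUS
    SELECT txn_id, txn_amount, txn_date FROM new_txn;

    -- Row-count reconciliation per load date.
    SELECT l.txn_date, l.cnt AS legacy_cnt, n.cnt AS new_cnt
      FROM (SELECT txn_date, COUNT(*) AS cnt FROM legacy_txn GROUP BY txn_date) l
      LEFT JOIN (SELECT txn_date, COUNT(*) AS cnt FROM new_txn GROUP BY txn_date) n
        ON n.txn_date = l.txn_date
     WHERE n.cnt IS NULL OR n.cnt <> l.cnt;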

Environment: Informatica Power Center 9.x, Oracle 11g, SQL Server 2008, SPSS, PL/SQL, Teradata, UNIX Shell Scripting, Visual Studio, Erwin, Business Objects, Tableau, MS Excel, MS Access, Windows XP.

Confidential, Rochester, MN

Data Analyst/ETL Developer

Responsibilities:

  • Processed large Excel source data feeds for Global Function Allocations and loaded the CSV files into the Oracle database with the SQL*Loader utility.
  • Performed code Inspection and moved the code into Production Release.
  • Documented all related activities in Quality Center and coordinated with the QA team.
  • Performed data filtering and dissemination activities, troubleshot database activities, and diagnosed bugs and logged them in the version control tool.
  • Analyzed adversary activity using SIGINT metadata.
  • Contributed to project management, requirements analysis, the software development lifecycle, full-lifecycle documentation, and database work.
  • Performed source data analysis, captured metadata, and reviewed results with the business; corrected data anomalies per business recommendations.
  • Involved in migrating mappings from IDQ to PowerCenter.
  • Extensive hands-on experience with Hadoop file system commands for file handling operations.
  • Managed multiple project work streams.
  • Converted Hive/SQL queries into Spark transformations using Spark RDDs, Python, and Scala.
  • Created ad hoc reports for users in Tableau by connecting to various data sources.
  • Created source-to-target mappings for multiple sources from SQL Server to Oracle.
  • Used Excel sheets, flat files, and CSV files to generate ad hoc Tableau reports.
  • Performed batch processing of data; designed the SQL scripts, control files, and batch files for data loading.
  • Performed the Data Accuracy, Data Analysis, Data Quality checks before and after loading the data.
  • Coordinated with the Business Analyst Team for requirement gathering and Allocation Process Methodology, designed the filters for processing the Data.
  • Designed, developed, implemented and maintained Informatica Power Center and Informatica Data Quality (IDQ) application for matching and merging process.
  • Designed and developed database objects (tables, materialized views, stored procedures, indexes) and SQL statements for executing the Allocation Methodology, creating the OP table and CSV/text files for the business.
  • Maintained a metadata repository which included administration of Data Stewards, and ETL data.
  • Performed the physical database design, normalized tables, and worked with denormalized tables to load data into the fact tables of the data warehouse.
  • Designed and developed the dimensional model and ETL process using Informatica.
  • Developed SQL Joins, SQL queries, tuned SQL, views, test tables, scripts in development environment.
  • Used SQL*Loader to load data from external system and developed PL/SQL programs to dump the data from staging tables into base tables.
  • Developed data migration and data validation scripts from the old system to the new one.
  • Extensively wrote SQL queries (subqueries, correlated subqueries, and join conditions) for data accuracy, data analysis, and data extraction needs.
  • Created mapping documents using the Metadata extracted from the Metadata repositories.
  • Resolved the data type inconsistencies between the source systems and the target system using the Mapping Documents.
  • Designed star schema with dimensional modeling, created fact tables and dimensional tables.
  • Involved in data analysis, data discrepancy reduction in the source and target schemas.
  • Designed and developed the star-schema data model and fact tables to load data into the data warehouse.
  • Implemented one-to-many and many-to-many entity relationships in the data modeling of the data warehouse.
  • Analyzed the tables once the imports were done.
  • Edited database SQL files with the vi editor and used UNIX commands for job scheduling, executing files, process and background control, grep, etc.
  • Wrote Perl scripts to extract data from the Oracle database, check database connections, apply regular expressions, etc.
  • Developed VBA integration between Excel feeds and the SQL database, including SQL extraction and transformation of Excel data into the SQL database.
  • Involved in writing Oracle PL/SQL procedures, functions, Korn Shell scripts that were used for staging, transformation and loading of the data into base tables.
  • Involved in debugging and trouble-shooting of database objects.
  • Involved in data loading and data migration: used SQL*Loader to load data from Excel files into staging tables and data cubes, and developed PL/SQL procedures to load data from the staging tables into the base tables of the data warehouse (a skeleton procedure follows this list).
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Used TOAD and SQL Developer to develop and execute queries.
  • Worked on SQL query performance issues, using index logic to obtain good performance.
  • Created the XML control files to upload the data into Data warehousing system.
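A hedged skeleton of the staging-to-base PL/SQL load procedure referenced above; stg_alloc, base_alloc, and their columns are placeholder names rather than actual project objects.

    -- Minimal staging-to-base load: insert only rows not yet in the base table
    -- (stg_alloc/base_alloc and all columns are placeholders).
    CREATE OR REPLACE PROCEDURE load_base_alloc AS
    BEGIN
      INSERT INTO base_alloc (alloc_id, cost_center, alloc_amount, load_date)
      SELECT s.alloc_id, s.cost_center, s.alloc_amount, SYSDATE
        FROM stg_alloc s
       WHERE NOT EXISTS (SELECT 1
                           FROM base_alloc b
                          WHERE b.alloc_id = s.alloc_id);
      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;  -- let the scheduler/caller see the failure
    END load_base_alloc;
    /

A production version would typically add logging and batch commits, but the staging/NOT EXISTS shape is the core of the pattern.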

Environment: Oracle 11g, Tableau, Data Warehouse, OLAP, SQL Navigator, SPSS, Visual Studio 2010, SQL Developer, Erwin 4.0, XML, OLTP, MS Excel 2000, MS Office 2000, Microsoft Windows XP Professional.

Confidential, Minnesota, MN

Oracle/ETL developer

Responsibilities:

  • The business design work involved establishing the reporting layouts for various reports and the frequency of report generation.
  • Identified the information needs within and across functional areas of the organization.
  • Modeled the process in the enterprise-wide scenario.
  • Field mapping work involved establishing relationships between database tables and the filter criteria, formulas, etc. needed for the reports.
  • Managed database optimization and table-space fragmentation.
  • Involved in the full Software Development Lifecycle (SDLC).
  • Responsible for developing, implementing, and testing the data migration strategy for the overall project, using SQL Server 2012 as the platform with global resources.
  • Developed database objects including tables, indexes, views, sequences, packages, triggers, and procedures to troubleshoot database problems.
  • Worked with the Informatica PowerCenter 8.6 tools: Source Analyzer, Warehouse Designer, Mapping and Mapplet Designer, and Transformation Developer.
  • Developed Informatica mappings and tuned them for better performance.
  • Extracted data from different flat files, MS Excel, MS Access and transformed the data based on user requirement using Informatica Power Center and loaded data into target, by scheduling the sessions.
  • Created different source definitions to extract data from flat files and relational tables for Data mart.
  • Used dynamic SQL to perform pre- and post-session tasks required during extraction, transformation, and loading (a sketch follows this list).
  • Tuned query performance by working intensively on indexes.
  • Created reusable mapplets and transformations, started concurrent batch processes on the server, and performed backup, recovery, and tuning of sessions.
  • Created, modified, deployed, optimized, and maintained Business Objects Universes using Designer.
  • Designed the ETL process using Informatica to populate the Data Mart from flat files into the Oracle database.
  • Created complex mappings to populate the target with the required information.
  • Created workflows and sessions to perform the required transformations.
  • Performed ETL testing by validating data.
  • Created SSIS packages to extract data from Excel and MS Access files using Pivot Transformation, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc., to generate underlying data for the reports and to export cleansed data from Excel spreadsheets, text files, MS Access, and CSV files to the data warehouse.
  • Wrote UNIX shell scripts to automate processes.
  • Wrote SQL and PL/SQL scripts to extract data from the database and for testing purposes.
  • Performed testing and a QA role: developed the Test Plan and Test Scenarios and wrote SQL*Plus test scripts for execution on converted data to ensure correct ETL data transformations and controls.
  • Automated the file provisioning process using UNIX, Informatica mappings, and Oracle utilities.
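For illustration, a hedged sketch of the pre-/post-session SQL pattern mentioned above, in Oracle SQL/PL-SQL; stg_orders and ETL_USER are placeholder names, and the real pre-/post-SQL would be configured on the Informatica session properties.

    -- Pre-session SQL: clear the staging table before each load
    -- (stg_orders is a placeholder name).
    TRUNCATE TABLE stg_orders;

    -- Post-session SQL: refresh optimizer statistics on the freshly loaded table
    -- (run as an anonymous PL/SQL block; ETL_USER/STG_ORDERS are placeholders).
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => 'ETL_USER',
                                    tabname => 'STG_ORDERS');
    END;
    /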

Environment: Informatica Power Center 8.6.1, Flat files, MS Excel Files, MS Access, SSIS 2008, Oracle 9i/10g, Erwin 7.3, Power Designer, MS SQL Server 2005/2000, PL/SQL, Teradata V2R5, Mainframes, Toad, Perl, Unix scripting, Windows NT, Autosys, Business Objects

Confidential

Data Analyst

Responsibilities:

  • Responsibilities include gathering business requirements, developing strategy for data cleansing and data migration, writing functional and technical specifications, creating source to target mapping, designing data profiling and data validation jobs in Informatica, and creating ETL jobs in Informatica.
  • Designed the data marts using Ralph Kimball’s dimensional data mart modeling techniques.
  • Gathered all the Sales Analysis report prototypes from the business analysts belonging to different business units.
  • Designed, developed and tested data mart prototype (SQL 2005), ETL process (SSIS) and OLAP cube (SSAS)
  • Involved in Data Extraction, Transformation and Loading (ETL) from source systems
  • Responsible for ETL design (identifying the source systems, designing source-to-target relationships, data cleansing, data quality, creating source specifications, and ETL design documents) and ETL development (following Velocity best practices).
  • Customer information received from legacy systems was cleansed and then transformed into staging tables and target tables in DB2.
  • Used external tables to transform and load data from legacy systems into target tables.
  • Used data transformation tools such as DTS, SSIS, Informatica, and DataStage.
  • Conducted Design reviews with the business analysts, content developers and DBAs
  • Designed, developed, and maintained Enterprise Data Architecture for enterprise data management including business intelligence systems, data governance, data quality, enterprise metadata tools, data modeling, data integration, operational data stores, data marts, data warehouses, and data standards.
  • Performed incremental loading of the fact table from the source system to the staging table on a daily basis (a sketch follows this list).
  • Coded SQL stored procedures and triggers.
  • Designed different types of star schemas, including detailed data marts, plan data marts, monthly summary data marts, and inventory data marts, using Erwin (with various dimensions such as Time, Services, Customers, Sales Hierarchy, and Orders snowflake dimensions, and various fact tables).
  • Worked on data modeling and produced data mapping and data definition documentation.
  • Worked on SQL Server 2005 concepts: SSIS (SQL Server Integration Services), SSAS (Analysis Services), and SSRS (Reporting Services).
  • Designed a reporting-table approach to get better performance from complex queries and views.
  • Performed data maintenance activities such as REORG, RUNSTATS, LOAD, UNLOAD, IMPORT, and EXPORT.
  • Performed tuning and installation of DB2 on Linux.
  • Treated performance tuning, including query tuning, as an ongoing process.
  • Monitored and provided 24x7 support for the production database.
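A hedged sketch of the daily incremental fact load described above, in DB2-style SQL; fact_sales, stg_sales, and etl_control are placeholder names, not objects from the actual project.

    -- Daily incremental load: move only rows newer than the last watermark
    -- (fact_sales/stg_sales/etl_control are placeholders).
    INSERT INTO fact_sales (sale_id, product_key, customer_key, sale_amount, sale_date)
    SELECT s.sale_id, s.product_key, s.customer_key, s.sale_amount, s.sale_date
      FROM stg_sales s
     WHERE s.sale_date > (SELECT last_load_date
                            FROM etl_control
                           WHERE table_name = 'FACT_SALES');

    -- Advance the watermark only after a successful load.
    UPDATE etl_control
       SET last_load_date = CURRENT DATE
     WHERE table_name = 'FACT_SALES';

Keeping the watermark in a control table makes the load restartable: if the insert fails, the watermark is untouched and the next run picks up the same rows.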

Environment: Linux, IBM Web sphere 6.0, Erwin, SQL Server 2000/2005, Crystal Reports 9.0, HTML, Data Stage Version 7.0, Oracle, Toad, MS Excel, PowerPoint.
