
ETL/Business Intelligence Consultant Resume

NY

PROFESSIONAL SUMMARY:

  • I have 5+ years of IT experience focusing on Data Warehousing, Data Modeling, Data Integration, Data Migration, ETL processes and Business Intelligence.
  • Package Software: Expertise in Informatica ETL and reporting tools. Deep understanding of the Data Warehousing SDLC and architecture of ETL, reporting and BI tools.
  • ETL: 5 years of Data Warehousing experience using Informatica PowerCenter 8.6/8.4. Excellent skills in implementing ETL packages using DTS and SSIS.
  • Database / ETL Performance Tuning: Broad experience in database development, including effective use of database objects, SQL Trace, Explain Plan, optimizer types, hints, indexes, table partitions, sub-partitions, materialized views, global temporary tables, autonomous transactions, bulk binds and Oracle built-in functions; performance tuning of Informatica mappings and workflows (a brief bulk-bind sketch follows this list).
  • Data Modeling: 5 years of dimensional data modeling experience using Erwin 7.3 and the Ralph Kimball approach: Star/Snowflake modeling, data marts, OLAP, fact & dimension tables, physical & logical data modeling, and MS Visio.
  • Drawing on experience in all aspects of analytics/data warehousing solutions (database issues, data modeling, data mapping, ETL development, metadata management, data migration and reporting solutions), I have been key in delivering innovative database/data warehousing solutions to the Retail and Finance industries.
  • Strong understanding of data modeling (relational, dimensional, Star and Snowflake schemas), data analysis, and data warehouse implementations on Windows and UNIX.
  • Developed mappings in Informatica to load data from various sources into the data warehouse using transformations such as Source Qualifier, Java, Expression, Lookup, Aggregator, Update Strategy and Joiner.
  • Extensive experience in developing stored procedures, functions, views, triggers and complex queries using SQL Server T-SQL and Oracle PL/SQL.
  • Experience in resolving ongoing maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • SDLC: 5 years of IT experience in system analysis, design, coding and testing. Extensive experience in Data Warehousing implementations, Data Migration and ETL for the Retail, Product Development, Finance and Pharma domains.
  • Database Development and Analysis: 5 years of experience in database development, primarily with Oracle. Experience with data cleansing, data profiling and data analysis; UNIX shell scripting, SQL and PL/SQL coding.
  • Business Intelligence: 2 years of Business Intelligence experience.
  • Superior communication skills, strong decision making and organizational skills along with outstanding analytical and problem-solving skills to undertake challenging jobs. Able to work well independently and in a team by helping to troubleshoot technology and business-related problems.
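
As one example of the bulk-bind technique referenced above, here is a minimal PL/SQL sketch, assuming hypothetical STG_ORDERS and DW_ORDERS tables with identical column layouts (all names are illustrative, not from any engagement):

  -- Minimal sketch of an Oracle bulk-bind load (BULK COLLECT + FORALL).
  -- STG_ORDERS and DW_ORDERS are hypothetical tables with the same columns.
  DECLARE
    CURSOR c_src IS
      SELECT * FROM stg_orders;
    TYPE t_rows IS TABLE OF c_src%ROWTYPE;
    l_rows t_rows;
  BEGIN
    OPEN c_src;
    LOOP
      -- Fetch in batches to keep memory usage bounded.
      FETCH c_src BULK COLLECT INTO l_rows LIMIT 10000;
      EXIT WHEN l_rows.COUNT = 0;
      -- One context switch per batch instead of one per row.
      FORALL i IN 1 .. l_rows.COUNT
        INSERT INTO dw_orders VALUES l_rows(i);
      COMMIT;
    END LOOP;
    CLOSE c_src;
  END;
  /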

TECHNICAL SKILLS:

ETL: Informatica 9.x/8.x (Power Center/Power Mart/Power Exchange) (Designer, Workflow Manager, Workflow Monitor, Server Manager, Power Connect).

BI & Reporting: Tableau, SAP Business Objects, SQL Server Reporting Services.

Data Modeling: Erwin 7.3, Logical Modeling, Physical Modeling, Relational Modeling, ER Diagrams, Dimensional Data Modeling (Star Schema Modeling, Snowflake Schema Modeling, FACT and Dimensions Tables), Entities, Attributes, Cardinality, MS Visio

Databases: Oracle 10g/9i, MS SQL Server, Sybase 12.x/11.x, SQL Navigator, Teradata, SQL*Loader, MS Access.

Others: C, Unix Shell Scripting, SQL, PL/SQL, ANSI SQL, JAVA, Transact SQL, SQL*Plus, HTML 4.0, ODBC, SSIS, Quest TOAD.

Environment: UNIX (Sun Solaris, HP-UX, IBM AIX), Windows.

PROFESSIONAL EXPERIENCE:

Confidential, NY

ETL/Business Intelligence Consultant

Responsibilities:

  • Designed data extraction processes to meet business requirements across multiple data sources.
  • Provided technical expertise and guidance for data management, quality and reporting functions.
  • Implemented ETL processes for extracting and loading data into the data warehouse.
  • Defined and designed data integration mapping rules and procedures.
  • Effectively used the data blending feature in Tableau.
  • Defined best practices for Tableau report development.
  • Created side-by-side bar charts, heat maps and symbol maps according to deliverable specifications.
  • Met regularly with client subject matter experts to gather functional business requirements and build the SQL queries behind the dashboards (a sample query is sketched after this list).
  • Created daily, weekly and monthly dashboards in Tableau Desktop and published them to Tableau Server.
  • Created workbooks and dashboards for analyzing statistical data using Tableau 8.0/8.1.
  • Developed complex, reusable formula-based reports in Tableau with advanced features such as conditional formatting, built-in and custom functions, and multiple groupings.
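
As referenced above, the dashboard SQL was typically a small aggregate query used as a Tableau data source; a minimal sketch is below, assuming a hypothetical SALES_FACT fact table and DATE_DIM dimension (all names are illustrative):

  -- Hypothetical weekly sales aggregate feeding a Tableau data source.
  -- SALES_FACT and DATE_DIM are illustrative star-schema tables.
  SELECT d.calendar_year,
         d.calendar_week,
         f.region_id,
         SUM(f.sales_amt)           AS total_sales,
         COUNT(DISTINCT f.order_id) AS order_count
    FROM sales_fact f
    JOIN date_dim   d ON d.date_key = f.date_key
   WHERE d.calendar_date >= ADD_MONTHS(TRUNC(SYSDATE), -12)
   GROUP BY d.calendar_year, d.calendar_week, f.region_id
   ORDER BY d.calendar_year, d.calendar_week;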

Environment: Informatica Power Center 10.2/9.6/9.1, Control-M 6.4, Tableau

Confidential, Berkeley Heights, New Jersey

ETL Developer

Responsibilities:

  • Worked with business users and business analysts on requirements gathering and business analysis.
  • Converted business requirements into high-level and low-level designs.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
  • Extracted data from flat files, MS Excel and MS Access, transformed it per user requirements using Informatica Power Center, and loaded it into the target through scheduled sessions.
  • Worked with the Informatica Power Center 8.6 tools - Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer and Transformation Developer. Developed Informatica mappings and tuned them for better performance.
  • Used Normalizer transformation with COBOL sources stored in DB2.
  • Created reusable mapplets and transformations, started concurrent batch processes on the server, and performed backup, recovery and tuning of sessions.
  • Created sequential batches and concurrent batches for sessions.
  • Developed PL/SQL procedures/packages to kick off SQL*Loader control files and load procedures that move data into Oracle (a load/merge sketch follows this list).
  • Configured static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Created SSIS packages to extract data from Excel and MS Access files using Pivot, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction and Aggregate transformations along with Execute SQL, Data Flow and Execute Package tasks, to generate the underlying data for reports and to export cleansed data from Excel spreadsheets, text files, MS Access and CSV files to the data warehouse.
  • Wrote SQL and PL/SQL scripts to extract data from the database and for testing purposes.
  • Supported daily loads and worked with business users to handle rejected data.
  • Developed interfaces using UNIX shell scripts to automate bulk load and update processes.
  • Executed test scripts to verify actual results against expected results, using Power Connect for source (DB2) validation and Oracle for target validation.
  • Stored reformatted data from relational, flat file and XML sources using Informatica.
  • Implemented data cleansing for files using Informatica and Trillium.
  • Automated the file provisioning process using UNIX, Informatica mappings and Oracle utilities.
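
As referenced above, a load procedure of this kind typically merges SQL*Loader-staged rows into the warehouse target; a minimal PL/SQL sketch is below, assuming hypothetical STG_CUSTOMER and DW_CUSTOMER tables (all names are illustrative):

  -- Hypothetical upsert from a SQL*Loader-populated staging table
  -- into the warehouse target.
  CREATE OR REPLACE PROCEDURE load_dw_customer AS
  BEGIN
    MERGE INTO dw_customer t
    USING (SELECT customer_id, customer_name, city, last_update_dt
             FROM stg_customer) s
       ON (t.customer_id = s.customer_id)
     WHEN MATCHED THEN UPDATE
          SET t.customer_name  = s.customer_name,
              t.city           = s.city,
              t.last_update_dt = s.last_update_dt
     WHEN NOT MATCHED THEN INSERT
          (customer_id, customer_name, city, last_update_dt)
          VALUES (s.customer_id, s.customer_name, s.city, s.last_update_dt);
    COMMIT;
  END load_dw_customer;
  /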

Environment: Power Center 8.6.1, Flat files, MS Excel Files, MS Access, SSIS 2008, Oracle 9i/10g, Erwin 7.3, Power Designer, MS SQL Server 2005/2000, PL/SQL, IBM DB2 8.0, Teradata V2R5, Mainframes, Toad, Perl, Unix scripting, Windows NT, Autosys, Microsoft Project Plan.

Confidential, Dallas, TX

ETL Developer

Responsibilities:

  • Worked with business users and business analysts on requirements gathering and business analysis.
  • Converted business requirements into high-level and low-level designs.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
  • Extracted data from flat files, MS Excel and MS Access, transformed it per user requirements using Informatica Power Center, and loaded it into the target through scheduled sessions.
  • Worked with the Informatica Power Center 8.6 tools - Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer and Transformation Developer. Developed Informatica mappings and tuned them for better performance.
  • Used Normalizer transformation with COBOL sources stored in DB2.
  • Created reusable mapplets and transformations, started concurrent batch processes on the server, and performed backup, recovery and tuning of sessions.
  • Created sequential batches and concurrent batches for sessions.
  • Developed PL/SQL procedures/packages to kick off SQL*Loader control files and load procedures that move data into Oracle.
  • Configured static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
  • Created SSIS packages to extract data from Excel and MS Access files using Pivot, Fuzzy Lookup, Derived Column, Conditional Split, Term Extraction and Aggregate transformations along with Execute SQL, Data Flow and Execute Package tasks, to generate the underlying data for reports and to export cleansed data from Excel spreadsheets, text files, MS Access and CSV files to the data warehouse.
  • Wrote SQL and PL/SQL scripts to extract data from the database and for testing purposes (a reconciliation query is sketched after this list).
  • Supported daily loads and worked with business users to handle rejected data.
  • Developed interfaces using UNIX shell scripts to automate bulk load and update processes.
  • Executed test scripts to verify actual results against expected results, using Power Connect for source (DB2) validation and Oracle for target validation.
  • Stored reformatted data from relational, flat file and XML sources using Informatica.
  • Implemented data cleansing for files using Informatica and Trillium.
  • Automated the file provisioning process using UNIX, Informatica mappings and Oracle utilities.
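
As referenced above, the test scripts were typically reconciliation queries comparing the source extract with the warehouse target; a minimal sketch is below, assuming hypothetical SRC_TXN and DW_TXN tables (all names are illustrative):

  -- Hypothetical reconciliation check: row counts and amount totals
  -- by load date, source extract vs. warehouse target.
  SELECT NVL(s.load_dt, t.load_dt) AS load_dt,
         s.src_rows, t.tgt_rows,
         s.src_amt,  t.tgt_amt
    FROM (SELECT load_dt, COUNT(*) AS src_rows, SUM(txn_amt) AS src_amt
            FROM src_txn GROUP BY load_dt) s
    FULL OUTER JOIN
         (SELECT load_dt, COUNT(*) AS tgt_rows, SUM(txn_amt) AS tgt_amt
            FROM dw_txn GROUP BY load_dt) t
      ON s.load_dt = t.load_dt
   WHERE NVL(s.src_rows, 0) <> NVL(t.tgt_rows, 0)
      OR NVL(s.src_amt, 0)  <> NVL(t.tgt_amt, 0);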

Environment: Power Center 8.6.1, Flat files, MS Excel Files, MS Access, SSIS 2008, Oracle 9i/10g, Erwin 7.3, Power Designer, MS SQL Server 2005/2000, PL/SQL, IBM DB2 8.0, Teradata V2R5, Mainframes, Toad, Perl, Unix scripting, Windows NT, Autosys, Microsoft Project Plan.
