Data Warehouse Consultant Resume

Fort Lauderdale, Florida

SUMMARY

  • Around 8 years of IT and consulting experience focused on ETL, Data Quality, Data Discovery, EDW, Big Data, Analytics and Visualization using Informatica, Oracle, Teradata, HIVE, Microstrategy and Tableau across various industries such as Banking, Finance and Credit Cards.
  • Expertise in Informatica products (PowerCenter, PowerExchange, B2B, IDQ, Address Doctor).
  • Extensive experience using Informatica PowerCenter 9.x/8.x to carry out the extraction, transformation, and loading process, as well as administration tasks such as creating domains, repositories, and folders.
  • Specialized in relational database technologies like Oracle SQL and Teradata.
  • Worked with the business to identify project requirements and how they could be implemented effectively on the platform; associated with more than 80 deliverables, providing sign-offs from design review through code review and test report review.
  • Designed and developed ETL solutions to move data between various sources and targets such as RDBMS, flat files, XML files, VSAM, MQ, and web services (SOAP).
  • Experience in all the phases of data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, Deployment and Operations.
  • Extensive knowledge in Data Modeling (Logical data model and Physical data model), Dimensional Modeling (STAR and Snowflake Schema).
  • Hands on experience on several key areas of Enterprise Data Warehousing such as Change Data Capture (CDC), Slowly Changing Dimensions (SCD Type I and Type II).
  • Expertise in creating very detailed design documents and performing proofs of concept (POCs).
  • Wrote UNIX shell scripts to automate batch programs (a minimal wrapper sketch follows this list); worked with scheduling tools like Control-M and AutoSys.
  • Hands-on experience building mappings with a wide range of transformations, such as Expression, Filter, Rank, Connected and Unconnected Lookup, Router, Aggregator, Normalizer, Transaction Control, Joiner, Update Strategy, Web Services Consumer, and Stored Procedure.
  • Worked on performance tuning, identifying and resolving bottlenecks at various levels: sources, targets, mappings, and sessions.
  • Experience in troubleshooting methodology, using the Debugger and workflow and session logs to diagnose errors and to recognize and repair connection and network errors; properly configured workflows and sessions for recovery and high availability of the ETL environment.
  • Expertise in defining and documenting ETL Process Flow, Job Execution Sequence, Job Scheduling and Alerting Mechanisms using command line utilities.
  • Worked on data profiling using IDQ (Informatica Data Quality) to examine different patterns of source data. Proficient in developing Informatica IDQ transformations like Parser, Classifier, Standardizer, and Decision.
  • Experience with creating profiles, rules, scorecards for data profiling and quality using IDQ.
  • Designed Mappings using B2B Data Transformation Studio.
  • Strong working experience in production support: receiving production calls, identifying and fixing bugs, and handling maintenance phases, delivering on-time resolutions according to client and customer requirements.
  • Experience in Microstrategy and Tableau, designing reports, dashboards, and visualizations based on end-user requirements.
  • Experience working in agile methodology and ability to manage change effectively.
  • Responsible for Team Delivery and participated in Design Reviews.
  • Experience in coordinating with cross-functional teams and project management.
  • Excellent communication and interpersonal skills and analytical reasoning; quickly assimilates the latest technologies, concepts, and ideas.
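
As a minimal illustration of the batch automation above, a wrapper script along these lines would write a parameter file and launch a PowerCenter workflow with pmcmd. The domain, service, folder, workflow, connection names, and paths are hypothetical, not taken from any engagement described here.

    #!/bin/sh
    # Hypothetical wrapper: build a parameter file, run a workflow, alert on failure.
    PARAM_FILE=/app/infa/param/wf_daily_load.par

    # Write the mapping/session variables the workflow expects.
    cat > "$PARAM_FILE" <<EOF
    [DW_FOLDER.WF:wf_daily_load.ST:s_m_daily_load]
    \$\$LOAD_DATE=$(date +%Y-%m-%d)
    \$DBConnection_SRC=ORA_SRC
    \$DBConnection_TGT=ORA_DW
    EOF

    # Start the workflow and wait; pmcmd returns non-zero on failure.
    pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$INFA_USER" -p "$INFA_PWD" \
        -f DW_FOLDER -paramfile "$PARAM_FILE" -wait wf_daily_load
    RC=$?

    if [ $RC -ne 0 ]; then
        echo "wf_daily_load failed (rc=$RC)" | mailx -s "ETL FAILURE" dw-support@example.com
    fi
    exit $RC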

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.6.1, 9.1, 8.5, Informatica PowerExchange, Informatica Analyst 9.x, Informatica Developer 9.x, Informatica B2B, Sqoop, DataStage 8.5, SSIS.

Languages: PL/SQL, Python, J2EE, HTML

Databases: Oracle 11g/10g/9i, Teradata 14, Hive, Microsoft SQL Server 2008

Scripting: Shell scripting

Other Tools: Tableau, Microstrategy, SSRS, OBIEE, BO, SQL Developer, Hue, Toad, BMC Control-M, AutoSys, PuTTY, Erwin, Visio, JIRA, Rally, MS Office Packages

PROFESSIONAL EXPERIENCE

Confidential, Fort Lauderdale, Florida

Data Warehouse Consultant

Responsibilities:

  • Performed requirement gathering, analysis, design, development, testing, implementation, support, and maintenance phases of both the Amex-B2B and STAR warehouse applications.
  • Experience developing and deploying full-lifecycle projects using Agile development methodology.
  • Involved in creating logical and physical data models using the Erwin data modeling tool.
  • Assisted in dimensional modeling to design dimension and fact tables.
  • Worked on designing and coding the complete ETL process using Informatica for various transactions and loading data from different sources like Flat Files, VSAM, Web Services, MQ and Relational Database.
  • Followed the Informatica Best Practices and Standards while creating Informatica Mappings, Sessions and Workflows.
  • Created Informatica Mappings to build business rules to load data using transformations like Source Qualifier, Aggregator, Expression, Joiner, Connected and Unconnected lookups, Filters, Router and Update strategy.
  • Extensively worked on Mapping Variables, Mapping Parameters and Session Parameters.
  • Manipulated code pages for sources and targets to meet requirements.
  • Worked extensively with parameter files to define mapping, session, and environment variables.
  • Created mappings to perform integrity and balancing checks before the target files were sent out to third parties.
  • Worked extensively on flat files: the flat file import wizard, target load order, and indirect files.
  • Worked on tuning session performance for data warehouse loads: configuring caches and buffer memory, partitioning sessions, setting target commit intervals, pushdown optimization, etc.
  • Worked with Teradata utilities like Teradata Parallel Transporter (TPT), BTEQ, FastLoad, MultiLoad, and TPump (a BTEQ sketch follows this list). Used the Teradata FastExport utility to export data from the Teradata database wherever necessary.
  • Worked with worktables, log tables, error tables in Teradata.
  • Worked on importing and exporting data between Oracle/Teradata and HDFS/Hive using Sqoop (see the sketch after this list).
  • Used the Hue (version 2.6.1-2) web interface to query the data.
  • Developed UNIX shell scripts for running and scheduling batch jobs, and for automating and streamlining existing manual procedures.
  • Worked on tuning complex SQL queries and complex SQL overrides to join heterogeneous databases.
  • Involved in address parsing using Address Doctor to avoid incorrect addresses being sent to the Canadian market.
  • Enabled verbose tracing whenever required to drill down to the root of an issue.
  • Worked extensively on code reusability.
  • Worked with extensive integration and transformation of unstructured data using B2B data exchange.
  • Extensively worked on data profiling and quality for data cleansing, data matching and data conversion.
  • Worked on IDQ transformations like Parser, Classifier, Standardizer and Decision.
  • Used PowerExchange connectors to connect to multiple source systems of the Loyalty platform, to bring in data and load it to a staging area.
  • Identified and eliminated duplicates in datasets through IDQ components.
  • Used Constraint Based loading & target load ordering to load data into the tables while maintaining the PK-FK relation in the same mapping.
  • Extensively used various data cleansing and data conversion functions like LTRIM, RTRIM, TO_DATE, DECODE, and IIF in Expression transformations.
  • Took responsibility for the overall design and development and for code drops to the testing environment.
  • Involved in debugging mappings and unit testing; prepared test cases.
  • Troubleshot the issues by checking sessions and workflow logs.
  • Involved in creating dashboard and storytelling using Tableau.
  • Performed code reviews that included enhancing mappings and shell scripts for faster data extracts while avoiding impact to other processes hosted in the application.
  • Involved in SIT, UAT, regression and performance testing.
  • Worked with the SFTP team and the MQ application, since B2B is responsible for transmitting files using various protocols like Connect:Direct, HTTPS, FTPS, and SFTP.
  • Worked closely in resolving Issues using Session Logs and Workflow Logs.
  • Followed Scrum management methodologies and worked with BI and DW administration teams on code migrations, production support, and system monitoring.
  • Used JIRA for the maintenance and tracking of ETL bug fixes and defects.
  • Strong working knowledge of ServiceNow incidents, problems, and change requests; adhered to all metrics and SLAs.
  • Coordinated with clients, business and offshore team members as part of onshore assignment.
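
As a sketch of the Sqoop imports mentioned in the list above (the JDBC URL, credentials, and table names are hypothetical):

    # Pull an Oracle table into Hive over four parallel mappers
    # (connection details and names are hypothetical).
    sqoop import \
        --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
        --username etl_user \
        --password-file hdfs:///user/etl/.ora_pwd \
        --table CARD_TXN \
        --hive-import \
        --hive-table edw.card_txn \
        --num-mappers 4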

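And a minimal BTEQ sketch of the Teradata utility work, using the exit-code convention that lets the scheduler catch failed steps; the server, credentials, and tables are hypothetical:

    #!/bin/sh
    # Run a BTEQ step and propagate Teradata errors to the scheduler
    # (server, credentials, and table names are hypothetical).
    bteq <<EOF
    .LOGON tdprod/etl_user,etl_pwd
    INSERT INTO dw.card_txn SELECT * FROM stg.card_txn;
    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0
    EOF
    exit $?
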
Environment: Informatica PowerCenter 9.0.1/9.6.1, Informatica Analyst, Informatica Developer 9.6.1, PowerExchange, B2B, Address Doctor, Oracle 11g, Teradata 14, Hive, Sqoop, Hue, HDFS, Tableau, Erwin, Unix Shell Script, Control-M, Batch Processing, SOAP Web Services, IBM MQ, ServiceNow

Confidential

Sr. ETL Lead

Responsibilities:

  • Experienced in analysis, design, and development of ETL, data marts, data cleansing, data integration, data modeling, dimensional modeling, data loading, Oracle tuning, implementation, testing, and maintenance of business applications.
  • Performed analysis, design, and development of new components, and designed and developed Informatica ETL mappings to read source data and load it into the target warehouse, for the Credit Risk Data Mart as part of a Finance and Risk restructuring programme.
  • Developed mapping document indicating the source tables, columns, data types, transformation required, business rules, target tables, materialized views/snapshots, columns and data types.
  • Implemented SCD Type I and Type II tables using Informatica; created multiple mappings and wrote transformation procedures, which were used to load monthly data into the datamarts (a plain-SQL sketch of the Type II pattern follows this list).
  • Designed and managed datamart tables, indexes, partitions for better storage and query from reports.
  • Analyzed complex data load queries and debugged issues by backtracking through the associated tables and ETL jobs.
  • Responsible for ETL Code walkthrough and Knowledge Transfer at all stages.
  • Created unit test plan master document with appropriate test cases and test results for ETL Code.
  • Involved in performance tuning of mappings and sessions, reducing load times for different marts, managing database space for huge amounts of data, and scheduling mart loads in the production system.
  • Developed reports using Microstrategy BI tool adhering to attributes and metrics standards.
  • Worked with DBA in making enhancements to physical DB schema. Also coordinated with DBA in creating and managing tables, indexes, tablespaces, schemas and privileges.
  • Performed data cleanups on feeds and data fixes in the production system; responsible for troubleshooting back-end production problems.
  • Well versed with exposures, adjustments, and Basel II elements like PD, LGD, EAD, and Economic Capital. Responsible for the Economic Capital and CR datamarts, and made substantial changes to the Exposure datamart for this client.
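
The SCD Type II loads referenced above were built as Informatica mappings; as a rough plain-SQL sketch of the same close-and-insert pattern (all table, column, and sequence names are hypothetical):

    #!/bin/sh
    # Close out changed dimension rows, then insert the new versions.
    # All names are hypothetical; the real logic lived in Informatica mappings.
    sqlplus -s "$DB_USER/$DB_PWD@CRDM" <<EOF
    WHENEVER SQLERROR EXIT SQL.SQLCODE
    UPDATE dim_counterparty d
       SET d.eff_end_dt = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM stg_counterparty s
                    WHERE s.cpty_id = d.cpty_id
                      AND s.risk_rating <> d.risk_rating);

    INSERT INTO dim_counterparty
        (cpty_key, cpty_id, risk_rating, eff_start_dt, eff_end_dt, current_flag)
    SELECT dim_counterparty_seq.NEXTVAL, s.cpty_id, s.risk_rating,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_counterparty s
     WHERE NOT EXISTS (SELECT 1 FROM dim_counterparty d
                        WHERE d.cpty_id = s.cpty_id
                          AND d.current_flag = 'Y');

    COMMIT;
    EXIT
    EOF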

Environment: Informatica PowerCenter 9.0, Oracle 11g, Microstrategy 9.x, UNIX, HPSA, Control-M, JIRA, Toad, IBM Rational ClearCase

Confidential

Informatica ETL Developer

Responsibilities:

  • Implemented business logic by designing and developing Informatica mappings, using several transformation techniques to read source data and load it into the target warehouse for the analytic system.
  • Implemented performance enhancements to speed up data processing through Informatica mappings.
  • Involved in creating data audit mappings and feed management from multiple sources; performed data file formatting, manipulation, and cleansing using complex transformation logic.
  • Sourced data from XML, using various XML transformations to feed XML data in and out.
  • Core responsibilities included data modeling, process design for development, data migration, data loading, data replication, data fixes, Oracle tuning, supporting the reporting team with enhanced data loads and data modeling, and release management.
  • Developed AutoSys job and dependency definitions and metadata, PL/SQL procedures, and shell scripts; modified stored procedures for performance improvement.
  • Created and maintained Oracle schema objects like tablespaces, tables, rollback segments, indexes, sequences, and synonyms.
  • Developed ETL design documents and IQA/EQA documents, which were implemented in the ETLs and shared with clients.
  • Developed technical specifications and high-level and detailed design documents for the OLTP system.
  • Performed unit, integration, and regression testing, and obtained business sign-off for the transaction system.
  • Performed data loads in the test/QA environment and resolved the bugs and defects found during testing and the QA data validation process; in the QA database, the data was also validated by clients/users (UDV Region) and end users.
  • Involved in creating the environment document, which provides instructions for implementing and testing the project in the QA and production environments; managed releases, cutover activities, change management, and high-volume data migration.
  • Provided production support during code deployment in production, including data validation in the production database after the first run.
  • Designed and developed Universes using Business Objects Universe Designer.
  • Designed and created WEBI reports using Web Intelligence Rich Client and InfoView.
  • Created reports involving table/section breaks, multiple data providers and derived tables.
  • Scheduled jobs using Python scripts and Oracle DBMS_SCHEDULER (a sketch follows this list).
  • Team management activities included task allocation, understanding development needs, analysis, running effective meetings, escalating high-priority issues, problem solving, and encouraging long-term solutions.
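
The DBMS_SCHEDULER jobs mentioned above could be registered along these lines; the job name, procedure, and calendar are hypothetical:

    #!/bin/sh
    # Register a nightly load job with Oracle DBMS_SCHEDULER
    # (job name, procedure, and schedule are hypothetical).
    sqlplus -s "$DB_USER/$DB_PWD@ANLDB" <<EOF
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'NIGHTLY_FEED_LOAD',
        job_type        => 'STORED_PROCEDURE',
        job_action      => 'etl_owner.load_daily_feed',
        repeat_interval => 'FREQ=DAILY; BYHOUR=2; BYMINUTE=0',
        enabled         => TRUE);
    END;
    /
    EXIT
    EOF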

Environment: Informatica 8.6.1, Oracle 10g, SQL, PL/SQL, SQL Developer, SSIS, SSRS, Business Objects XI, OBIEE, Python, JavaScript, Unix, AutoSys, Vertica, SVN

Confidential

Trainee

Responsibilities:

  • Worked on requirements gathering from the clients and estimation of efforts including analysis of existing system and analysis of data.
  • Used Informatica PowerCenter 8.1 and all its features extensively to transform and load data into Oracle 10g.
  • Gathered requirements for the scope of loading data from various sources to database.
  • Built integrations between various sources.
  • Extracted data from various sources, including flat files, to transform and load into staging (a file-list sketch follows this list).
  • Designed and developed ETL Mappings using Informatica to extract data from flat files and XML, and to load the data into the target database.
  • Worked with Variables and Parameters in the mappings.
  • Involved in developing Informatica mappings and mapplets, tuning them for optimum performance, and handling dependencies and batch design.
  • Troubleshot mappings to improve performance by identifying bottlenecks.
  • Performance tuning of the process at the mapping level, session level, source level, and the target level.
  • Provided weekly status reports to the manager on progress and timelines.
  • Used Control-M scheduler to schedule Informatica mappings.
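
As a small sketch of how a multi-file flat-file load into staging might be prepared, assuming the Informatica session source was configured as an indirect file list (the paths are hypothetical):

    #!/bin/sh
    # Build the file list that a session with an Indirect source
    # filetype reads (paths are hypothetical); fail fast so the
    # scheduler can alert if no files arrived.
    INBOUND=/data/inbound
    LIST=$INBOUND/daily_filelist.txt

    ls $INBOUND/feed_*.dat > "$LIST" 2>/dev/null

    if [ ! -s "$LIST" ]; then
        echo "No inbound feed files found" >&2
        exit 1
    fi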

Environment: Informatica PowerCenter 8, Oracle 9i, Unix, Control-M, HP Quality Center.
