
Sr. ETL Consultant Resume

Atlanta

OBJECTIVE:

Seeking an opportunity to join a fast-growing team in Business Intelligence and Data Integration, to apply cutting-edge technologies and expand my footprint in the growing field of IT.

SUMMARY:

  • 10+ years of experience in the analysis, design and implementation of Business Intelligence and Data Quality applications, with strong experience in Data Warehousing (OLTP & OLAP), cloud technologies, AWS and Big Data analytics across Telecom, Banking, Insurance, Healthcare, Manufacturing, Retail and Government clients, including consulting for Fortune 500 companies and the Big 4 consulting firms.
  • Design and development of ETL frameworks to load data from structured, semi-structured and unstructured sources such as Oracle, SQL Server, fixed-width and delimited flat files, DB2, COBOL files, XML files, PDF files and log files (using the Data Processor transformation, Parser and Serializer).
  • Design, development and optimization of efficient data pipelines to summarize and aggregate diverse data from/to data warehouses and Hadoop YARN clusters using MapReduce programming, NoSQL, HiveQL, Pig Latin scripts, HBase, Sqoop, Flume, Informatica 9.6.1 BDE and other data integration products.
  • Migration and development in Informatica Cloud Services (ICS) using Bundles, Integration Templates, Mappings, Task Flows, Saved Queries, Input and Output Macro fields, Mapping Configuration Tasks and Cascading Filters, and its integration with cloud reporting tools such as QlikView and Tableau.
  • Extensive experience with the Inmon (top-down) and Kimball (bottom-up) data warehouse design approaches and other DW concepts and methodologies, including logical, conceptual and physical data models (star schema, snowflake schema and fact constellations) and the load and design of facts (additive, semi-additive, non-additive, factless, etc.) and dimensions (conformed, junk, role-playing, slowly changing dimensions).
  • Excellent interpersonal and communication skills; technically competent and results-oriented, with strong problem-solving skills and the ability to work independently and exercise sound judgment.

TECHNICAL SKILLS:

BI Applications: Informatica PowerCenter 8.6/9.1/9.5/9.6.1 Big Data Edition/10.2.1, OBIEE (Oracle Business Intelligence Enterprise Edition), Spotfire, QlikView, Cognos 10, Informatica Cloud Services (ICS), Informatica Metadata Manager, Informatica Dynamic Data Masking, Informatica Test Data Management (TDM), Tableau 9.0, Informatica Developer 9.6.1, SQL Server Integration Services (SSIS) 12.0

Languages: C/C++, PL/SQL, Core Java, Perl 5.2 (Strawberry Perl), FORTRAN 77, FORTRAN 90, JIL

Cloud Technologies: Salesforce (SFDC), Microsoft Azure, Informatica Cloud Services (ICS)

Hadoop Ecosystem: Hadoop, MapReduce, HDFS, HBase, Hive, Pig, Flume, Sqoop, Oozie

Databases: Oracle 9i/10g/11g/12c, SQL Server 2005/2008/2012, MS Access, MySQL, Sybase

Operating System: Windows 8/7/Vista/XP/NT/2000, UNIX, Linux, AIX, HP-UX

Version Control: Apache Subversion (SVN), GIT, Mercurial, IBM ClearCase

Defect Tracking: JIRA, HP Quality Control (QC), IBM Rational ClearQuest

Scheduling Tools: Autosys R4.5-R11, Control-M, Informatica Scheduler, Espresso, CA, Informatica Cloud Real Time (ICRT), ActiveBatch 9.0.5/11.0

Data Modeling/Methodologies: Erwin Data Modeler 9.x/8.x/7.x, MS Visio, Agile, Waterfall

EMPLOYMENT EXPERIENCE

Sr. ETL Consultant

Confidential, Atlanta

Responsibilities:

  • Upgraded and enhanced the product used to generate IRS Form 1095-C (introduced under the Affordable Care Act, ACA) by implementing employee hour tracking and eligibility and affordability determination, consuming EIN information, HR and payroll data, PII and benefits details for external clients.
  • Extensively used Informatica Developer 10.2 to create an import process covering data quality checks, data validation rules, data cleansing, EIN data purging, error reporting and reject reprocessing.
  • Designed ETL processes for web service calls and REST API integration using the Informatica HTTP transformation (downloading files from websites via GET) and SOAP web service integration using the Web Services Consumer transformation.
  • Developed Informatica mappings and Data Synchronization tasks to load data to Amazon S3; used client-side encryption with a master symmetric key, partitioning to read from multiple files, a distribution column to write to multiple files, and the merge-partition-file option to optimize mapping performance.
  • Developed mappings to extract data from heterogeneous sources and load preload staging tables; used SQL parameter files, mapping parameters, mapping variables, session variables, and workflow tasks such as Email, Scheduler, Command, Event-Raise, Event-Wait, Decision, Worklet, Assignment and Timer.
  • Created Mapping Configuration Tasks (MCT) and mappings to load data into Amazon Redshift; used SSE-S3 and SSE-KMS for server-side data encryption.
  • Used the UNLOAD command to extract data from Amazon Redshift into staging files on Amazon S3, the COPY command to load flat files into Redshift, and VACUUM to recover disk space and re-sort data; configured Amazon Redshift pushdown optimization (source-side and full), as sketched below.
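
A minimal sketch of the Redshift staging pattern described above, run through psql (Redshift accepts PostgreSQL clients on port 5439). The cluster endpoint, S3 bucket, IAM role and table names are hypothetical, and the script assumes PGPASSWORD is set in the environment.

```bash
#!/usr/bin/env bash
# Hypothetical endpoint and credentials; export PGPASSWORD before running.
PSQL="psql -h example-cluster.abc123.us-east-1.redshift.amazonaws.com -p 5439 -U etl_user -d dw"

$PSQL <<'SQL'
-- UNLOAD: extract query results to pipe-delimited staging files on S3
UNLOAD ('SELECT * FROM stg.transactions WHERE load_dt = CURRENT_DATE')
TO 's3://example-etl-bucket/staging/txn_'
IAM_ROLE 'arn:aws:iam::111122223333:role/redshift-s3-access'
DELIMITER '|' GZIP PARALLEL ON;

-- COPY: load flat files from S3 into a Redshift table
COPY stg.transactions
FROM 's3://example-etl-bucket/inbound/'
IAM_ROLE 'arn:aws:iam::111122223333:role/redshift-s3-access'
DELIMITER '|' IGNOREHEADER 1 GZIP;

-- VACUUM: reclaim disk space and re-sort rows after heavy churn
VACUUM FULL stg.transactions;
ANALYZE stg.transactions;
SQL
```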

ETL Consultant (Informatica/Hadoop)

Confidential, Tampa Bay

Responsibilities:

  • Integrated domain data from PeopleSoft, transaction detail rows, chargeback dates, settlement submission files, unmapped static data (such as POS entry type), transaction authorization/summary, sale refunds and pending transfers with the Risk Management System using PowerCenter.
  • Designed mappings in Informatica Developer 9.6.1 BDE to push entire mapping code down to HDFS/Hive as native MapReduce programs, including incremental updates of external Hive tables, and used Sqoop import and export to move data between HDFS and Oracle (see the sketch after this list).
  • Performed impact analysis in Informatica Metadata Manager by extracting data lineage information for data structures (source and target definitions), fields and multiple PowerCenter repositories.
  • Tuning and monitoring of a large data warehouse by implementing virtual and materialized views, additional indexes, segmentation and partitioning of various fact tables, caching and other performance improvement options.
  • Designed and developed ETL routines in Informatica PowerCenter using Aggregator, XML Parser and Serializer, Rank, connected and unconnected Stored Procedure and Lookup transformations, Mapplets, SQL overrides, and data-flow management into multiple targets using Routers and target load plans.
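
A minimal sketch of the Sqoop transport between Oracle and HDFS mentioned above; the connection string, credentials, table names and HDFS paths are hypothetical.

```bash
#!/usr/bin/env bash
# Hypothetical Oracle connection and table names.

# Incremental import of new rows into the HDFS directory backing an external Hive table
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user --password-file /user/etl/.ora_pwd \
  --table RISK.TXN_DETAIL \
  --incremental append --check-column TXN_ID --last-value 1000000 \
  --target-dir /data/risk/txn_detail \
  --fields-terminated-by '|' -m 4

# Export aggregated results from HDFS back to Oracle
sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user --password-file /user/etl/.ora_pwd \
  --table RISK.TXN_SUMMARY \
  --export-dir /data/risk/txn_summary \
  --input-fields-terminated-by '|' -m 4
```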

Sr. Informatica Cloud Consultant

Confidential, Boston

Responsibilities:

  • Analyzed and transformed Universal Life (UL) product data from the mainframe PAS (Policy Admin System), covering reinsurance data, funds, base and rider coverages, covered parties, premium and charge schedules, benefits, claims and deposits, into the actuarial modeling system AXIS on a policy-year basis for pricing analysis, model validation and valuation supporting current-period financial reporting.
  • Migration and development in Informatica Cloud Services (ICS) using Bundles, Integration Templates, Task Flows, Mapping Configuration Tasks (MCT), Data Synchronization Tasks, Data Replication Tasks, PowerCenter Tasks, Saved Queries, Input and Output Macro fields, Cascading Filters and Mapplets.
  • Used VSAM COBOL files and copybooks containing OCCURS and REDEFINES clauses; normalized and cleansed the data to load it into the ODS (Operational Data Store).
  • Designed and created an ETL/Informatica framework for process control, audit/balance/control, operational metadata logging and data recovery, capturing ETL runtime metrics and error information.
  • Extensively used Perl to create several modules for Canadian and American UL systems, including highly complex calculations built with modules, subroutines, scalars, arrays, hashes and structures.
  • Developed a data pipeline using Flume, Sqoop and Pig to extract data from weblogs and perform transformations, event joins, bot-traffic filtering and pre-aggregations before storing the data in HDFS (a simplified stand-in is sketched after this list).
  • Created dashboards, user stories, bar charts, line charts, pie charts, etc. in Tableau by joining and blending dimension and measure data across Oracle, flat files and Hive, using table calculations and functions, LOD expressions, and quick and context filters to build intuitive visualizations per the business requirements.
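
The weblog pipeline itself ran on Flume, Sqoop and Pig; the shell sketch below is only a simplified stand-in showing the same idea of filtering bot traffic and pre-aggregating before landing data in HDFS (the log path and HDFS target are hypothetical).

```bash
#!/usr/bin/env bash
# Simplified stand-in for the Flume/Pig flow: drop bot traffic from a combined-format
# access log, count hits per URL, and stream the result into HDFS.
zcat /var/log/httpd/access.log.gz \
  | grep -viE 'bot|crawler|spider' \
  | awk '{print $7}' \
  | sort | uniq -c \
  | hdfs dfs -put - /data/weblogs/page_hits_$(date +%F).txt
```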

Informatica Consultant

Confidential

Responsibilities:

  • Created mappings, queries and packages to integrate the Informatica solution, building a Client Data Repository (CDR) for Credit Risk with several downstream applications hosted on Oracle, SQL Server, Sybase and Salesforce.
  • Assisted with the Informatica Master Data Management (MDM) 9.7.1 product to meet CIBC’s CDR business requirements (data model, business rules, user interface configurations, etc.) and integrated its metadata with Informatica and Cognos.
  • Designed ETL processes to move Salesforce data from production to sandboxes by creating custom objects, using them as external IDs for certain components, and configuring change data capture (CDC) and the flush interval.
  • Planned and executed an enterprise-wide initiative to migrate the scheduling tool Autosys from R4.5 to R11, including upgrading the software, applying hot fixes, creating virtual machines and job owners, migrating feed hub accounts, and installing Autosys clients on Windows and UNIX.
  • Developed UNIX scripts to perform sanity checks such as data file availability, header/trailer record validation, row counts, processing status and keyless SFTP, with embedded pmcmd commands to start batches and sessions from the command line (see the sketch after this list).
  • Automated file exchange between applications by creating automatic ETL feeds to transfer files between servers, NAS and the feed hub, and scheduled the jobs using Autosys Job Information Language (JIL).
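
A minimal sketch of the sanity-check-then-launch pattern described above. The feed layout (HDR/TRL records, pipe-delimited trailer count) and the PowerCenter service, folder and workflow names are hypothetical; pmcmd's startworkflow syntax is standard.

```bash
#!/usr/bin/env bash
# Hypothetical feed file with HDR/TRL records; trailer field 2 holds the detail row count.
FEED=/data/inbound/cdr_feed.dat

[[ -s "$FEED" ]] || { echo "feed file missing or empty" >&2; exit 1; }
head -1 "$FEED" | grep -q '^HDR' || { echo "bad header record" >&2; exit 1; }

expected=$(tail -1 "$FEED" | awk -F'|' '/^TRL/ {print $2}')
actual=$(( $(wc -l < "$FEED") - 2 ))   # exclude header and trailer
[[ "$expected" == "$actual" ]] || { echo "row count mismatch" >&2; exit 1; }

# Launch the PowerCenter workflow and block until it completes
pmcmd startworkflow -sv INT_SVC -d Domain_ETL -u "$PM_USER" -p "$PM_PASS" \
  -f CDR_LOADS -wait wf_load_cdr
```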

Sr. Application Support Analyst

Confidential

Responsibilities:

  • Supported and developed new and existing code in the Confidential data warehouse environment, administered repositories and provided application support for Confidential businesses including Wireless, Cable and Technician Settlement.
  • Developed complex mappings, including SCD Type 1, Type 2 and Type 3 mappings, in Informatica to load data from various sources using transformations such as Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Java, Sequence Generator, Joiner, Filter, Rank, Router and SQL; created complex mapplets for reuse.
  • Integration of various data sources like Oracle, SQL Server, Fixed Width and Delimited Flat Files, DB2, COBOL files & XML Files.
  • Tuned the performance of new and existing mappings/sessions using techniques such as external loading, increasing checkpoint intervals, dropping index and key constraints, increasing database network packet size, and adjusting session data cache/block size and sequence buffer length.
  • Designed mappings to integrate the TIBCO messaging system with Oracle using the XML Generator and XML Parser transformations, and imported other XML sources as source qualifiers for transformation.
  • Tuned integrated SQL code by generating and interpreting explain plans and SQL Trace/TKPROF output to identify high-cost SQL, forcing specific execution plans, updating database statistics, etc. (see the sketch after this list).
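
A minimal sketch of the plan/trace workflow described above, using Oracle's EXPLAIN PLAN, DBMS_XPLAN and SQL Trace with tkprof. The schema, table and trace-file names are hypothetical.

```bash
#!/usr/bin/env bash
# Hypothetical schema and connection; set ORA_PASS in the environment.
sqlplus -s etl_user/"$ORA_PASS"@DWPROD <<'SQL'
-- Capture the optimizer's plan for a suspect query
EXPLAIN PLAN FOR
  SELECT s.settle_id, t.tech_name, s.amount
  FROM dw.fact_settlement s
  JOIN dw.dim_technician t ON t.tech_key = s.tech_key
  WHERE s.settle_dt >= TRUNC(SYSDATE) - 7;
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- Trace an actual execution so tkprof can report real I/O and timings
ALTER SESSION SET SQL_TRACE = TRUE;
SELECT COUNT(*) FROM dw.fact_settlement WHERE settle_dt >= TRUNC(SYSDATE) - 7;
ALTER SESSION SET SQL_TRACE = FALSE;
SQL

# Format the raw trace from the server's trace directory (file name is environment-specific)
tkprof DWPROD_ora_12345.trc slow_query.prf sort=exeela sys=no
```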

Informatica Data Masking Specialist

Confidential

Responsibilities:

  • Designed and developed a data scrambling/masking solution for the organization, used to encrypt all sensitive customer information in non-production environments per SOX security requirements (a simplified stand-in is sketched after this list).
  • Responsible for applying performance-tuning methodologies to new and existing solutions, including identifying source, target and mapping bottlenecks, memory optimization, pipeline partitioning, collecting advanced performance data, pushdown optimization, etc.
  • Installed and configured Power Center servers which involved installing or upgrading Power Center, applying hot fixes, server and repository migration in Dev, QA and production environments.
  • Application Design, Data Extraction, Data Acquisition, Data Mining, Data Cleansing, Data Matching, Data Conversion Development, Implementations and Testing of Data warehousing and Database business systems.
  • Data modeling using data modeling tools, star schema/snowflake schema, fact & dimension tables, and physical & logical data modeling.
  • Performed Extraction, Transformation, Loading (ETL) of data from various sources into Data Warehouses and Data Marts using Informatica Power Center (Repository Manager, Designer, Workflow Manager, and Workflow Monitor).
  • Responsible and accountable for the sensitivity of all information maintained in the corporation’s various data warehouses, ensuring PCI Data Security Standards are upheld.
  • Worked on several projects using Agile Scrum methodology, participated in the Sprint cycle activities including Analysis, Sprint kick off, daily scrums, sprint retrospective and planning poker sessions.
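
The masking itself was implemented with Informatica's masking products; the snippet below is only a simplified illustrative stand-in for deterministic column scrambling in a non-production copy. Table, column and connection names are hypothetical, and STANDARD_HASH assumes Oracle 12c or later.

```bash
#!/usr/bin/env bash
# Simplified stand-in for product-based masking: deterministic hashes keep joins
# consistent across runs while hiding the real values. Hypothetical names throughout.
sqlplus -s mask_user/"$ORA_PASS"@DEVDB <<'SQL'
UPDATE cust.customer
   SET last_name = SUBSTR(RAWTOHEX(STANDARD_HASH(last_name || cust_id, 'MD5')), 1, 20),
       -- numeric scramble that preserves the 9-digit SSN format
       ssn       = LPAD(MOD(ORA_HASH(ssn), 999999999), 9, '0');
COMMIT;
SQL
```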

Data Warehouse Developer

Confidential

Responsibilities:

  • Interacted with Procurement and Operations/Maintenance business representatives to gather OBIEE report/dashboard requirements and define business and functional specifications.
  • Developed the OBIEE repository (.rpd) across three layers (Physical, Business Model and Presentation), Time Series objects, Siebel Interactive Dashboards with drill-down capabilities using global and local filters, web catalog objects (dashboards, pages, folders and reports), and scheduled iBots.
  • Data analysis; identification of dimensions (Type 1, Type 2 and Type 3), facts and aggregate tables, measures and hierarchies.
  • Developed reports/Dashboards such as Sales Consolidation Reports, Incident Management, Problem Management, Change Management, and Contract Management with different analytic views (Pivot table, charts, Column selectors, view selectors).
  • Debugged report and dashboard visibility with respect to users’ responsibilities and web groups in an integrated environment.
