
ETL Architect Resume


Dallas, TX

SUMMARY

  • Over 10 years of Information Technology experience in ETL and data warehouse technologies using Informatica Power Center, Informatica Data Quality, Big Data Management, B2B DT/DX, Informatica Cloud, Test Data Management, and Informatica Power Exchange 8.6.1/5.2 with Teradata, Netezza, and Oracle.
  • Designed and developed mappings using Source Qualifier, Expression, Connected and Unconnected Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, Joiner, and Rank transformations.
  • Experience integrating heterogeneous data sources (as well as homogeneous sources) such as Oracle, Teradata, SQL Server 2005/2008, Sybase, DB2, COBOL copybook mainframe files, VSAM files, XML files, flat files, and MS Access into a staging area.
  • Experienced with Informatica Power Exchange (9.6.1/8.6), using the Power Exchange Listener and Power Exchange Navigator to create data maps that load and retrieve data from mainframe systems and other enterprise-level sources through a cost-effective bulk loading mechanism.
  • Tasks involve all phases of the MDM Hub implementation process, including requirements analysis, system design, programming, unit testing, SIT (System Integration Testing), roll-out, maintenance, and operational support.
  • Experience working in Oracle 8i/9i/10g/11g with database objects such as triggers, stored procedures, functions, packages, views, and indexes.
  • Expertise in working with Teradata V2R5/V2R6 and R13 systems, using utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump, and SQL Assistant.
  • Worked with global temporary tables and volatile tables in Teradata, and created secondary indexes on join columns for fast data retrieval, improving performance and thereby efficiency (see the sketch after this summary).
  • Extensive experience in performance tuning: identified and fixed bottlenecks and tuned complex Informatica mappings for better performance.
  • Worked extensively with transformations in Informatica Data Quality (IDQ) and Master Data Management (MDM), including the Address Validator, Association, Case Converter, Comparison, Consolidation, Key Generator, Labeler, Match, Merge, Parser, SQL, Standardizer, Web Service Consumer, and Weighted Average transformations.
  • Learned the Hadoop ecosystem: Hive, Pig, Sqoop, Impala, HBase, Flume, Kafka, Storm, Oozie, and Zookeeper. Experience writing HiveQL queries; created UDFs to perform data transformations per requirements.
  • Expertise in data warehousing with a variety of data manipulation and data governance skills, including data migration, data modeling, data profiling, data cleansing, data masking, and data validation.
  • Expertise in OLTP/OLAP system study, analysis, and E-R modeling; developed database schemas such as star schema and snowflake schema used in relational, dimensional, and multidimensional modeling.
  • Experience in data modeling using Erwin Data Modeler (Erwin 4.0/3.5.5) and ER Studio for both physical and logical data modeling.
  • Proficient in using Informatica Developer and the Data Validation Option (DVO).
  • Experience working in Agile methodology; participated in Scrum meetings, daily stand-ups, reviews, retrospectives, sprint planning, etc.
  • Scheduled jobs and automated workflows using the Autosys, Control-M, and Tivoli Workload Scheduler job schedulers.
  • Maintained accurate data in project governance tools (HP Quality Center, JIRA, ClearCase, and ClearQuest), including project plan estimates, business requirements, design documents, unit/integration/system test plans and results, and deployment plans.
  • Experience using Business Objects XI, SSRS, and SAP BO to design and develop business intelligence reports.
  • Created complex SQL queries across multiple tables and views to generate complex reports.
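A minimal sketch of the volatile-table pattern mentioned above, run through BTEQ so the session-scoped volatile table survives across statements. All logon, database, table, and column names are hypothetical placeholders, not taken from any actual project:

#!/bin/sh
# Stage rows in a Teradata volatile table, add a secondary index on the
# staging table's join column, then join to a dimension. Hypothetical names.
bteq << 'EOF'
.LOGON tdprod/etl_user,etl_password;

/* Volatile table: session-scoped, no data dictionary entry */
CREATE VOLATILE TABLE vt_daily_sales AS (
    SELECT sale_id, customer_id, sale_amt
    FROM   staging.daily_sales_stg
) WITH DATA ON COMMIT PRESERVE ROWS;

/* Secondary index on the permanent table's join column for faster retrieval */
CREATE INDEX idx_cust (customer_id) ON staging.daily_sales_stg;

SELECT c.customer_name, SUM(v.sale_amt)
FROM   vt_daily_sales v
JOIN   edw.customer_dim c
  ON   c.customer_id = v.customer_id
GROUP  BY 1;

.LOGOFF;
EOF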

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 10.2, Informatica B2B DT/DX, Informatica Cloud (IICS/ICRT), Informatica IDQ 9.5, Informatica MDM 9.6, Informatica Big Data Edition (BDE/BDM), Data Analyst, Power Exchange 9.6/8.6.

Languages: SQL, PL/SQL, T-SQL, C, C++, UNIX Shell Scripting, Perl Scripting, HTML, JavaScript, XML.

Databases: Oracle 8i/9i/10g/11g, Teradata 15/14, MS SQL Server 2000/2005/2008, Sybase, Flat Files, XML Files, IBM Mainframe Systems, SFDC (salesforce.com), AS/400.

Data Modeling: Data Architecture, Erwin 3.x/4.x, Logical Data Modeling, Physical Data Modeling, Relational Data Modeling, ER Diagrams, Dimensional Data Modeling, Star Schema, Snowflake Schema, Fact and Dimension Tables.

Query & Utility Tools: TOAD 11.x/10.x/8.x/7.x, SQL*Plus, SQL*Loader, Teradata SQL Assistant 7.1, BTEQ, FastExport, Sybase Advantage, FastLoad, MultiLoad, Teradata TPT Scripts, Oracle SQL Developer.

Reporting Tools: Business Objects XI R3, SAP Business Objects 4.2, Cognos Reporting 10, SSRS.

OS & Packages: MS DOS, UNIX, Linux, Windows NT, MS Windows 9x/XP/7 Professional/7 Ultimate, MS Office.

SDLC Methodologies: Agile, Scrum, Waterfall, and RAD.

PROFESSIONAL EXPERIENCE

Confidential, Dallas, TX

ETL Architect

Responsibilities:

  • Create data modeling documents, source-to-target mapping documents, and ETL data flows for ETL development.
  • Administer the Informatica domain and manage deployments and migrations from Dev to QA to PROD.
  • Install upgrades and configure application services and domain settings.
  • Integrated Power Center 10.2 and IICS with Cloud B2B Gateway (DX and DT) and Cloud Data Quality, running workflows that transform data per business requirements.
  • Work on Cloud Data Quality objects such as profiles, scorecards, Data Quality transformations, and web service calls, and migrate objects from Power Center into Cloud Data Quality.
  • Create Informatica mappings, sessions, and workflows that access upstream data in EDI file format and convert it to 844 inbound and outbound formats for downstream systems to consume and load into MC400 systems; also create Informatica mappings and MFT in Informatica ICS.
  • Generate reports from the MC400 healthcare claims database so users can identify claims insights.
  • Created DT services to access EDI files for 835 claims (inbound/outbound) and 837 claims (inbound/outbound).
  • Worked in the DX Monitor to create profiles, partners, accounts, endpoints, MFT connections, workflows, and applications.
  • Worked in DT Studio within the Developer tool to create DT services using parsers, mappers, serializers, and streamers.
  • Worked on EDI 834, 835, 837, and 844 transactions using the HIPAA Accelerator to convert inbound XML into canonical X12 XML files.
  • Develop Cloud mappings, MCTs, task flows, connections, and schedules per requirements.
  • Worked on troubleshooting production issues in 837 Institutional, Professional, and Dental claim files.
  • Write scripts to access files from third-party systems, stage them on internal systems, archive them, and process them in Informatica Power Center 10.2.
  • Conducted POCs on multiple systems, including Snowflake, Databricks, Google BigQuery, AWS Lambda, Workday, and IICS B2B Gateway.
  • Use Informatica Data Transformation Studio to develop serializers, parsers, and mappers that transform EDI-format files into XML and other formats per business need.
  • Use the Informatica Data Exchange tool to process information and load data into a variety of systems by setting up profiles, partners, accounts, and workflows for each healthcare submitter, partner, and hospital system.
  • Develop Informatica pmcmd commands to run mappings and workflows (see the sketch after this list).
  • Migrate workflows, mappings, and sessions from the development environment to the QA and production environments, and document the list of dependent objects to move with them.
  • Access SQL Server, Oracle, and DB2, load data into staging, and then transform it per business requirements into the data warehouse.
  • Migrated objects after testing from the sandbox to the production environment in IICS; worked with transformations such as Source Qualifier, Expression, Router, Joiner, Filter, Aggregator, Update Strategy, Sequence Generator, and Lookup to transform data per business requirements.
  • Develop Informatica connections to source and target systems and load data into those systems per business requirements.
  • Worked on Microsoft Azure for landing target data, moving scripts, and saving script versions using Azure DevOps.
  • Used AWS for data loading, making it easy to design and deploy high-volume data integrations.
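A minimal sketch of the file-staging and workflow-trigger pattern referenced above: pull partner files over SFTP, archive a copy, then start the Power Center workflow with pmcmd. Host, path, credential, folder, and workflow names are hypothetical placeholders:

#!/bin/sh
# Pull third-party files, archive them, and kick off the load workflow.
# All names below are hypothetical.
SRC_HOST=partner.example.com
LANDING=/data/etl/landing
ARCHIVE=/data/etl/archive

# Batch-mode SFTP pull of the partner's outbound files
sftp -b - etluser@"$SRC_HOST" << EOF
get /outbound/claims_*.dat $LANDING/
EOF

# Keep a copy before processing
cp "$LANDING"/claims_*.dat "$ARCHIVE"/

# Start the Informatica workflow and wait for completion
pmcmd startworkflow -sv IS_PROD -d Domain_PROD \
    -u "$PM_USER" -p "$PM_PASS" -f CLAIMS_FLD -wait wf_load_claims || {
    echo "wf_load_claims failed" >&2
    exit 1
}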

Confidential, Bridgewater, NJ

Sr. Architect - ETL Informatica Power Center / Cloud Integration Architect

Responsibilities:

  • Worked extensively with Informatica Big Data Management (BDM) to move data from source systems to the data lake on Google Cloud Platform (GCP).
  • Worked extensively with Informatica BDM Developer tool transformations such as Address Validator, Association, Case Converter, Comparison, Consolidation, Key Generator, Labeler, Match, and Merge.
  • Working on creating mappings that fetch data from legacy systems and capture change data from AMS systems using Informatica BDM mappings.
  • Using Informatica BDM Hive connectors to load data from source systems into the Hive database for the reporting stream (see the sketch after this list).
  • Migrated sales data from legacy SAP systems to the Co-Star cloud system using Informatica Power Center 10.1 and shell scripts.
  • Received Co-Star (third-party cloud system) retail-financial delimited data files using SFTP shell scripts.
  • Processed the data in Informatica Power Center 10.1 and loaded it into SAP systems using SAP/ALE Prepare for IDoc transformations.
  • Accessed real-time data from third-party cloud systems using shell scripts and integrated it into SAP using Informatica Power Center 10.1 for reporting solutions.
  • Worked with the Informatica Big Data tool to move data from Oracle and SAP sources to Hive and Cassandra databases.
  • Developed and deployed data warehousing and analytics systems using Big Data and Cloud technologies; worked within an agile development process using CI/CD concepts; collaborated with other architects, technical consultants, and professional services teams.
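A minimal sketch of the Hive-side load pattern mentioned above: land the raw extract in an external staging table, then rewrite the reporting table's partition for the load date. Database, table, column, and path names are hypothetical placeholders:

#!/bin/sh
# Load a pipe-delimited extract into a partitioned Hive reporting table.
# All names below are hypothetical.
LOAD_DT=$(date +%Y-%m-%d)

hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS stg.policy_raw (
    policy_id   STRING,
    holder_name STRING,
    premium     DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '/data/lake/stg/policy_raw';

-- Rewrite the partition for the current load date
INSERT OVERWRITE TABLE rpt.policy
PARTITION (load_dt = '$LOAD_DT')
SELECT policy_id, holder_name, premium
FROM   stg.policy_raw;
"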

Confidential -San Jose, CA

Sr. ETL Informatica Cloud Integration Developer

Responsibilities:

  • Implemented Java code to fetch data from the Eloqua marketing cloud into an Oracle database, and implemented Informatica ETL transformation logic to move the data into EDW systems and finally into the sales data mart.
  • Implemented multiple fallback scenarios that look up the same source multiple times, per the requirements.
  • Executed multiple Proof of Concepts on fetching data from cloud systems to integrate data between Cloud systems and on premise databases.
  • Implemented Data Mart reporting solution for data coming from Cloud systems.
  • Implemented complex business logic with SCD Type 2 retention in the data mart (see the sketch after this list).
  • Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
  • Worked in Agile Development model using VSTS and participated in scrum meetings in project implementation.
  • Migrated sales data from a legacy system to Salesforce using Informatica Cloud integration.
  • Successfully integrated sales data into Salesforce using transformation logic per the business requirements.
  • Migrated integrated Salesforce data from the Salesforce cloud to Oracle and SQL Server for reporting purposes using Informatica Cloud mappings, Power Center 9.5, and MicroStrategy.
  • Integrated Salesforce data and Callidus Cloud data for reporting on the Callidus Cloud reporting platform.
  • Worked on integrating TrueComp commission and compensation plan calculations from Callidus Cloud into the data mart along with Salesforce data.
  • Integrated Workday cloud data using the Informatica Cloud connector into Oracle and SQL Server for data mart reporting.
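A minimal sketch of the SCD Type 2 retention logic mentioned above, expressed as plain Oracle SQL run through sqlplus (the actual logic lived in Informatica mappings). Schema, table, and column names are hypothetical placeholders:

#!/bin/sh
# Two-step SCD Type 2 load: expire changed current rows, then insert
# fresh current versions. All names below are hypothetical.
sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" << 'EOF'
-- Step 1: close out current rows whose tracked attributes changed
UPDATE dm.customer_dim d
SET    d.eff_end_dt   = TRUNC(SYSDATE) - 1,
       d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg.customer_stg s
               WHERE  s.customer_id = d.customer_id
               AND   (s.segment <> d.segment OR s.region <> d.region));

-- Step 2: insert a new current version for changed and brand-new customers
INSERT INTO dm.customer_dim
      (customer_id, segment, region, eff_start_dt, eff_end_dt, current_flag)
SELECT s.customer_id, s.segment, s.region,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
FROM   stg.customer_stg s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dm.customer_dim d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');

COMMIT;
EXIT;
EOF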

Confidential, Lake Forest, IL

Sr. ETL Developer (Lead) - Business Analytics Analyst

Responsibilities:

  • Performed extraction, transformation, and loading of data from RDBMS tables, flat files, and SQL sources into Teradata in accordance with requirements and specifications.
  • Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
  • Create UNIX shell scripts, ETL mappings, sessions & workflows using Informatica Power Center.
  • Worked on a team as part of a Big Data initiative in which Grainger projects a portion of sales transactional data into Actian Pervasive, one of the fastest databases for analytics.
  • Automated job processing using the Autosys scheduler and established automatic email notifications to the concerned persons by creating email tasks in Workflow Manager.
  • Expertise in working with Oracle stored programs, packages, cursors, triggers, database links, snapshots, tables, constraints, views, indexes, and sequences.
  • Extensive experience in the design and development of ETL processes, using Informatica mappings with transformations such as Expression, Filter, Joiner, Aggregator, Lookup (connected and unconnected), Update Strategy, Sequence Generator, Router, and XML to load consistent data into the database.
  • Worked extensively on creating technical design documents by gathering business requirements from business users, and developed source-to-target design documents for workflow development.
  • Developed mappings/Reusable Objects/Transformation/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center 9.1.
  • Worked with the Teradata team to load data using relational connections and utilities such as BTEQ, FastLoad, MultiLoad, FastExport, TPump, and TPT scripts (see the sketch after this list).
  • Involved in writing SQL stored procedures, shell scripts, and Perl scripts to access data from Oracle, Teradata, and flat files.
  • Worked with the revenue, finance, sales, and billing teams and implemented financial calculations to produce reports with highly accurate numbers.
  • Extensive experience in performance tuning, identified and fixed bottlenecks and tuned the complex Informatica mappings for better performance.
  • Worked on SAP BO reporting, preparing ad hoc and canned reports.
  • Worked with global temporary tables and volatile tables and used secondary indexes when joining two tables for fast data retrieval, improving performance and thereby efficiency.
  • Implemented various tasks in workflows, including Session, Command, Email, Event-Wait, Event-Raise, and Timer tasks.
  • Created and Executed workflows and Worklets using Workflow Manager to load the data into the Oracle Database.
  • Strong experience in implementing Informatica ETL for efficient data loading from different sources: Oracle, Teradata, SQL Server 2005/2008, DB2, and flat files.
  • Involved in different phases of Data Warehouse life cycle ranging from project planning, requirements gathering and analysis, offshore coordination, ETL design, ETL development and testing, Reporting, integration, regression and user acceptance testing, implementation, system documentation and support.
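A minimal sketch of a Teradata FastLoad invocation like those referenced in the utilities bullet above. FastLoad requires an empty target table; logon, database, table, and path names are hypothetical placeholders:

#!/bin/sh
# Bulk-load a pipe-delimited landing file into an empty Teradata staging
# table with FastLoad. All names below are hypothetical.
fastload << 'EOF'
LOGON tdprod/etl_user,etl_password;

BEGIN LOADING stg.sales_stg
      ERRORFILES stg.sales_err1, stg.sales_err2;

SET RECORD VARTEXT "|";

DEFINE sale_id     (VARCHAR(10)),
       customer_id (VARCHAR(10)),
       sale_amt    (VARCHAR(20))
FILE = /data/etl/landing/sales.dat;

INSERT INTO stg.sales_stg (sale_id, customer_id, sale_amt)
VALUES (:sale_id, :customer_id, :sale_amt);

END LOADING;
LOGOFF;
EOF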

Environment: Oracle 11g, Teradata 14, Informatica Power Center 9.1, SAP Business Warehouse, PL/SQL, FastLoad, MultiLoad, TPump, TPT Scripts, Autosys, Erwin, JIRA.

Confidential, Malvern, Pennsylvania

Sr. ETL Developer

Responsibilities:

  • Expert in various source transformations, including flat files, XML, and relational systems.
  • Proficient in analyzing data warehouses by building cubes using SQL Server Analysis Services (SSAS).
  • Strong experience in defining referenced relationships and in identifying KPIs in SSAS.
  • Worked exclusively for the asset management team on financial investments through which the enterprise operates in the markets.
  • Responsible for all Informatica MDM Hub development and configuration (match rules, trust framework, IDD, and Hierarchy Manager (HM)).
  • Designed and implemented stored procedures and triggers to automate tasks in SQL Server 2005/2008 and Oracle 10g (see the sketch after this list).
  • Leading and managing enterprise technical Master Data Management (MDM) solutions.
  • Owning and managing issue and conflict resolution; demonstrating problem-solving and decision-making skills.
  • Worked on SAP BO reporting, preparing ad hoc and canned reports.
  • Strong experience in implementing Informatica ETL for efficient data loading from different sources: Oracle, Teradata, SQL Server 2005/2008, DB2, and flat files.
  • Applied transformation logic of varying complexity to transform and transfer data into the downstream Teradata EDW.
  • Implemented scripts with Teradata utilities (FastLoad, MultiLoad, TPump, FastExport, and TPT scripts) to schedule loads to the Teradata EDW.
  • In-depth experience in business intelligence technologies and databases, with extensive knowledge of ETL processes and Reporting Services using SQL Server 2008/2005, SSIS, and SSRS.
  • Configured and maintained Report Manager and Report Server for SSRS.
  • Strong at developing custom reports and various tabular, matrix, ad hoc, and distributed reports in multiple formats using SQL Server Reporting Services (SSRS).
  • Providing expertise and architectural guidance for solution delivery
  • Working effectively in a fast-paced environment as part of a high-performing research, delivery and sustainment team
  • Leading requirement analysis, system analysis, design, development, testing, implementation
  • Managing a Tier 1 technical environment, including driving continuous improvement through managed metrics and change management.
  • Used Quality Stage to check the data quality of the source system prior to ETL process.
  • Participated in cross-functional efforts to support other teams - such as ETL and database tuning to support SSRS Reporting.
  • Worked with global temporary tables and volatile tables and used secondary indexes when joining two tables for fast data retrieval, improving performance and thereby efficiency.
  • Experience using the Normalizer transformation to normalize XML source data.
  • Extensively used the XML Parser transformation to generate target XML files.
  • Involved in creating various UNIX and Perl scripts that support ETL job scheduling and perform initial validations of various tasks.
  • Worked on PL/SQL scripts and converted PL/SQL stored procedures into Informatica ETL logic.
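A minimal sketch of the kind of task-automation trigger described in the stored procedures and triggers bullet above: an Oracle audit trigger that stamps modification metadata on each row, created through sqlplus. Schema, table, and column names are hypothetical placeholders:

#!/bin/sh
# Create a simple Oracle audit trigger. All names below are hypothetical.
sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" << 'EOF'
CREATE OR REPLACE TRIGGER trg_orders_audit
BEFORE INSERT OR UPDATE ON app.orders
FOR EACH ROW
BEGIN
    -- Stamp who touched the row and when, with no application changes
    :NEW.last_upd_by := USER;
    :NEW.last_upd_ts := SYSTIMESTAMP;
END;
/
EXIT;
EOF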

Environment: Informatica Power Center 9.6, MDM 9.6, Teradata V13, Oracle 10g, TOAD, Oracle SQL Developer, Microsoft SQL Server 2005/2008, DB2, HP Quality Center, XML, Flat Files, Windows, AccuRev, SAP ECC 6.0, HP-UX, OBIEE 11.1, Autosys.
