
ETL Integration Architect Resume

Torrance, California

SUMMARY:

  • Senior ETL Technical Lead/Architect with 12+ years of IT experience, focused on Informatica PowerCenter, Informatica Cloud, Informatica Metadata Manager, data profiling, Informatica Test Data Management, Informatica Data Quality, Informatica infrastructure, Oracle SQL, PL/SQL, Unix shell scripting, data warehouse maintenance, Data Governance, Service Oriented Architecture, Enterprise Application Integration, Enterprise Architecture, and operations support.
  • Hands-on experience with big data technologies; completed proof-of-concept projects using Apache Spark and Hive with Python (see the sketch after this list). Proficient with core Amazon Web Services.
  • Great communicator who thrives on building strong rapport and solving real-world business problems by aligning IT initiatives with corporate business goals. Proficient in analyzing and translating business requirements into technical requirements and architecture.
  • Hands-on experience with operational data, analytical data, unstructured data, and metadata; contributed to the architecture of a 20 TB operational data store at Confidential and worked on a 15 TB data warehouse project for Marathon Oil Corporation.
  • Created a versatile audit framework for Informatica sessions that can be used as a plug-in to any project. Good knowledge of error handling and recovery strategies in ETL and data warehousing projects.
  • Strong database and object-oriented programming skills, with development experience across the full SDLC; able to write complex SQL to analyze and evaluate data and translate requirements into back-end database structures.
  • Have led projects on tight deadlines in both waterfall and agile methodologies and delivered successfully for multiple clients, including Confidential and Marathon.
  • Have driven ETL best practices and enterprise standards while guiding vendor partners on projects.
  • Expert knowledge of normalized and dimensional modeling techniques using the erwin Data Modeler tool.
  • Proficient in translating business requirements, writing high-level design specifications, and ETL coding; extensively involved in performance tuning and data mapping activities while performing the architect role.
  • Mentored onboarding candidates and delivered training sessions covering project background, project tools, issue resolution, and client reporting and delivery.
  • Excellent written and verbal communication skills, including experience with proposal presentations, high-level design documentation, and sign-offs; strong presentation and interpersonal skills with a drive to achieve specified goals.
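
The following is a minimal sketch of the kind of Spark/Hive proof of concept referenced above, not the project code itself; the file path, table name, and column names are hypothetical placeholders.

    # Minimal PySpark sketch: cleanse a flat-file extract and persist it as a Hive table.
    # All names (paths, tables, columns) are illustrative placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (SparkSession.builder
             .appName("etl_poc")
             .enableHiveSupport()          # allows reading/writing Hive-managed tables
             .getOrCreate())

    raw = spark.read.option("header", True).csv("/data/landing/customer_extract.csv")

    cleansed = (raw
                .withColumn("customer_name", F.trim(F.upper(F.col("customer_name"))))
                .filter(F.col("customer_id").isNotNull())
                .dropDuplicates(["customer_id"]))

    cleansed.write.mode("overwrite").saveAsTable("stg_customer_poc")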

TECHNICAL SKILLS:

ETL Tools: Informatica 10.1, 9.6, 9.5, 9.1, 8.6, 8.1.1, 7.1.1

Languages: SQL, PL/SQL, JAVA, C

Database: Oracle 12c/11g/10g/9i/8i, DB2, Microsoft SQL Server, SAP R/3

Data Modelling Tools: erwin Data Modeler 9.1.

Scripting: Unix Korn Shell, Perl Scripting

Operating Systems: UNIX, Linux, Windows 2000/NT/98

Requirement Tools: IBM Rational DOORS.

Scheduling Tools: Redwood Scheduler

Reporting Tools: Crystal Reports XI R2, Crystal Reports 8.x, Cognos Impromptu (basics), Hyperion, OBIEE, Endeca 7

Testing Tools: HP Quality Center

Ticket Management Tools: JIRA, Remedy, ServiceNow

PROFESSIONAL EXPERIENCE:

Confidential, Torrance, California

ETL Integration Architect

Responsibilities:

The primary roles and responsibilities are as follows:

  • Facilitate Joint Application Development (JAD) / Joint Requirements Planning (JRP) sessions for requirements gathering
  • Interact with multiple lines of business, SMEs, Data Stewards, and Modelers from the CPS, Customer Central, and VDW groups to gather requirements and document use cases. Translate business requirements into system requirements to hand off to the development and testing teams.
  • Work in Agile methodology: write business stories, execute them, and implement within a sprint cycle.
  • Perform data profiling and data analysis, and create data mapping / source-to-target mapping documents, including the business rules and requirements to transform data from the staging layer to the dimensional data mart (see the profiling sketch after this list).
  • Facilitate walkthrough and hand-off sessions for the data mapping documents and scrub logic with the business, and obtain user acceptance and approvals.
  • Maintain data integrity across the front-end user interface, URL reports, and back-end database systems, ensuring record counts are consistent across applications for the business owners; achieved 100% compliance on the successful implementation of the project.
  • Designed ETL processes and took an active part in development, using Oracle to extract, transform, and load data for the projects.
  • Designed and extensively used Informatica Cloud and Salesforce integration for lead management systems, including data integrity across the URL reports, the user interface used by the business, the generated IT reports, and the database.
  • Work closely with Data Stewards and the business to identify data anomalies by performing data profiling and analyzing the data quality reports to resolve data quality/integrity issues.
  • Handle release management and the deliverables for the Integration Services modules being worked on.
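
As a rough illustration of the data profiling step mentioned above (not the Informatica-based tooling actually used), the Python/pandas sketch below computes per-column null percentages and distinct counts for a staging extract; the file name and contents are hypothetical.

    # Illustrative column-level profile of a staging extract, of the kind that feeds
    # a source-to-target mapping document. The file name is a placeholder.
    import pandas as pd

    stage = pd.read_csv("stg_customer_extract.csv", dtype=str)

    profile = pd.DataFrame({
        "non_null_count": stage.notna().sum(),
        "null_pct": (stage.isna().mean() * 100).round(2),
        "distinct_values": stage.nunique(),
    })
    print(profile.sort_values("null_pct", ascending=False))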

Confidential,Torrance, California

ETL Technical Lead/Architect

Responsibilities:

The primary roles and responsibilities are as follows:

  • Create project artifacts such as BRDs, SRSs, Gap analysis documents, RTMs.
  • Facilitate walkthroughs of the project artifacts and obtain sign-offs from the stakeholders
  • Perform data profiling and data analysis, and create data mapping / source-to-target mapping documents, including the business rules and requirements to transform data from the staging layer to the dimensional data mart.
  • Facilitate walkthrough and hand-off sessions for the data mapping documents with the data modelers and the ETL team, including developers and testers.
  • Designed ETL using Informatica and took an active part in development, extensively using transformations such as Source Qualifier, Lookup, Joiner, Aggregator, Expression, Normalizer, and Java transformations to extract, transform, and load data for the projects.
  • Performed extensive performance tuning and framed an audit and error-handling framework for the team (see the sketch after this list).
  • Extensively used AWS and Salesforce integration and data loads for various projects at Toyota.
  • Extensively used the Informatica Data Quality (IDQ) tool for data profiling and data sampling of the customer data.
  • Used Metadata Manager in Informatica for data lineage and used data profiling techniques to analyze the ODS data versus the source for validation purposes.
  • Hands-on experience with Informatica Master Data Management; wrote rules as part of the projects at Confidential.
  • Used Test Data Management and Data Quality tools as well as part of the project.
  • Documented test cases and performed business acceptance testing.
  • Work closely with Data Stewards to identify data anomalies by performing data profiling and analyzing the data quality reports to resolve data quality/integrity issues.
  • Create multiple extract specifications (both inbound and outbound) for receiving data from multiple source systems and sending data from the data mart to the modelers.
  • Guided and led the efforts on the Informatica Master Data Management POC; developed, tested, and profiled customer-related information at the enterprise level, and handled the deliverables for the Integration Services modules being worked on.
  • Handle release management, change management, and support activities for the PQSS T3 application, the CPS application, and the PQSS Endeca jobs; all jobs need to be closely monitored, and on successful completion daily status reports are produced for end users.
  • Worked with Peregrine tools and IBM Rational DOORS for change and release management.
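
The audit and error-handling framework itself was built around Informatica sessions; as a hedged, stand-alone illustration of the same plug-in idea, the sketch below wraps any load step and records its start/end time, row count, and status in an audit table. The table name etl_audit_log and the Oracle connection are hypothetical.

    # Sketch of a reusable audit wrapper: any load function decorated with @audited(...)
    # gets its run recorded in an audit table. Names below are placeholders.
    import datetime
    import cx_Oracle

    def audited(load_name, conn):
        def wrap(func):
            def inner(*args, **kwargs):
                start = datetime.datetime.now()
                status, rows = "FAILURE", 0
                try:
                    rows = func(*args, **kwargs)      # load step returns rows processed
                    status = "SUCCESS"
                    return rows
                finally:
                    conn.cursor().execute(
                        "INSERT INTO etl_audit_log "
                        "(load_name, start_ts, end_ts, row_count, status) "
                        "VALUES (:1, :2, :3, :4, :5)",
                        [load_name, start, datetime.datetime.now(), rows, status])
                    conn.commit()
            return inner
        return wrap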

Confidential, Torrance, California

ETL /Data Research specialist

Responsibilities:

  • Played the role of an ETL/Data Research specialist, understanding and documenting business requirements from various business groups such as the Endeca COE, TMS internal teams, the third-party vendor Flip top, and the Lexus VOC groups
  • Create the data model and perform ETL design and development
  • Perform data profiling and data analysis, and create data mapping / source-to-target mapping documents, including the business rules and requirements to transform data from the staging layer to the data warehouse
  • Work closely with Data Stewards to identify data anomalies by performing data profiling and analyzing the data quality reports to resolve data quality/integrity issues
  • Perform data integration across several internal and external applications based on customer needs
  • As part of the POC, developed various SQL queries to perform the feasibility study and later converted them into ETL mappings
  • Responsible for data analysis and data integration across several applications
  • Responsible for generating the flat file that MDEX (Endeca v7) could accommodate and load (see the sketch after this list).
  • Provide weekly status updates to the Toyota business
  • Performed several demos of the completed POC for various business groups within Toyota.
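
The Endeca/MDEX baseline load in this POC was fed with a flat file; assuming a pipe-delimited layout, the sketch below shows the general shape of producing such an extract in Python. The query, columns, connection string, and file name are hypothetical placeholders, not the project's actual code.

    # Illustrative pipe-delimited flat-file extract for an Endeca/MDEX baseline load.
    import csv
    import cx_Oracle

    conn = cx_Oracle.connect("user/password@dwh")        # placeholder credentials
    cursor = conn.cursor()
    cursor.execute("SELECT vin, model, model_year, dealer_code FROM vehicle_dim")

    with open("endeca_vehicle_extract.txt", "w", newline="") as out:
        writer = csv.writer(out, delimiter="|")
        writer.writerow(["VIN", "MODEL", "MODEL_YEAR", "DEALER_CODE"])   # header row
        for row in cursor:
            writer.writerow(row)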

Confidential, Torrance, California

ETL Developer/ Lead

Responsibilities:

  • Responsible for understanding the complex business requirements; took complete ownership of ETL activities from onsite and coordinated with the other team members for the successful delivery of this project.
  • Responsible for the data model, ETL design, and data load mechanism
  • Responsible for developing a complex requirement based on data metrics: model/model year versus the number of claims reported for that model/model year over the past n years
  • Performed a POC and proved an alternate formula for the standard value calculation, which reduced data storage from 300 million records to 30 million records and yielded better performance (see the aggregation sketch after this list)
  • Coordinated with the business to explain the mitigation plan and design aspects of this project.
  • Responsible for reviewing test cases, test validation and test results to ensure that they meet the entry and exit criteria.
  • Extensively used Cognizant proprietary tools such as Platinum AQUA (Automatic Quality Analyzer for Informatica) for testing, to report common bugs and deviations from standards and to determine performance improvement measures; a tool-based automated review process was conceptualized to eliminate defects
  • Involved in monitoring and supporting the jobs from onsite.
  • Received several rewards and recognitions from both Cognizant and Toyota.
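
One way to read the 300-million-to-30-million reduction described above is that the alternate formula allowed claim counts to be pre-aggregated by model and model year rather than stored at claim level; the pandas sketch below illustrates that aggregation idea under that assumption, with hypothetical file and column names.

    # Illustrative pre-aggregation: one row per (model, model_year) instead of one row
    # per claim, from which the metric can be recomputed. Names are placeholders.
    import pandas as pd

    claims = pd.read_csv("claims_detail.csv", dtype=str)

    agg = (claims
           .groupby(["model", "model_year"], as_index=False)
           .agg(claim_count=("claim_id", "count")))

    agg.to_csv("claims_by_model_year.csv", index=False)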

Confidential

ETL Developer

Responsibilities:

  • Involved in analysis of the functionality of existing mappings.
  • Made code changes to existing mappings and workflows for enhancements or maintenance.
  • Resolved many tickets that came under production support.
  • Unit test case preparation and unit testing.
  • Ensured defect free code has been delivered to production environment.
  • Conducted TAG (Technical Advisory Group) audits of other applications
  • Performed several automations that saved significant time and reduced manual intervention.
  • Developed several continuous improvements to stabilize the system and resolved all backlog items

Confidential

ETL Developer

Responsibilities:

  • Received knowledge transfer (KT) from the SME and presented a reverse KT back to the SME.
  • Provided shadow support for the day-to-day activities carried out at the client's location during the transition period.
  • Responsible for resolving all backlog tickets in addition to the planned deliverables.
  • Took the lead and assumed complete ownership of all deliverables to the client.
  • Analyzed the functionality of existing mappings and performed code changes to existing mappings and workflows for enhancements or maintenance.
  • Resolved Peregrine tickets as part of production support.
  • Also responsible for analyzing various enhancements, performing impact analysis to identify mappings potentially affected by proposed changes, and handling development and testing activities.
  • Unit test case preparation and unit testing.

Confidential

ETL Developer/ Team Lead

Responsibility & Achievements:

  • Involved in the design and development of a POC for this project using BAPI function calls in Informatica (see the sketch after this list).
  • Guided team members by imparting domain and technical knowledge and successfully produced the deliverables on time.
  • Involved in Analysis and Design phases and in designing the Data Model of the project.
  • Involved in some basic SAP transactions and worked on generating BAPI function calls in Informatica for SAP source tables.
  • Developed mappings in Informatica by functional/application area to load data from SAP source systems to Oracle staging tables and SQL Server target tables.
  • Performed Unit testing and System Integration Testing of the Informatica mappings
  • Responsible for developing the parameterized Crystal reports on the data loaded.
  • Responsible for design and style of the report layout.
  • Provided support during User Acceptance Testing for change requests to incorporate additional functionality
  • Reviewed QA and production Informatica mappings and sessions.
  • Involved in Unit, Integration and System testing of the mappings at offshore
  • Responsible as the SCM coordinator and DP coordinator, and for other quality-related activities.
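
The BAPI integration itself was configured inside Informatica; as a rough, stand-alone illustration of what a BAPI call looks like, the sketch below uses the SAP PyRFC connector from Python. The connection parameters, the BAPI name, and its parameters are placeholders and are not taken from the project.

    # Illustrative BAPI call via SAP PyRFC; all values below are placeholders.
    from pyrfc import Connection

    conn = Connection(ashost="sap.example.com", sysnr="00",
                      client="100", user="rfc_user", passwd="secret")

    # The result is a dict of export parameters and tables returned by the BAPI.
    result = conn.call("BAPI_CUSTOMER_GETLIST",
                       IDRANGE=[{"SIGN": "I", "OPTION": "BT",
                                 "LOW": "0000000001", "HIGH": "0000099999"}])
    print(result["ADDRESSDATA"][:5])      # first few rows of the returned table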
