
Integration Architect/developer Resume


California

SUMMARY

  • An accomplished IT professional with 15+ years of experience in Information Management, sharing accountability for a variety of enterprise data integration efforts spanning modern 'data lakes' and traditional data warehouse (EDW), data mart, and data migration applications at global Banking, Pharmaceutical (Sales and Clinical), Healthcare, Sales and Marketing, and Insurance (Auto, Homeowners, Credit Life, and Property/Casualty) organizations. Excellent interpersonal, technical, analytical, and problem-solving skills.
  • Highly motivated self-starter accustomed to working in both small and large teams and effective in lead and mentoring roles.
  • Extensive working experience using Informatica PowerCenter and PowerExchange with structured and unstructured sources, including relational, XML, COBOL, VSAM, flat file, and NoSQL systems
  • Extensive experience in SQL Server 2008/2012, DB2, Teradata 13, Netezza 7.2, and Oracle 12c PL/SQL database programming
  • Extensive experience in all phases of Enterprise Data Management and the Software Development Life Cycle (Agile/Waterfall), with comprehensive knowledge of Big Data/data lake life cycle management (data ingestion and acquisition, data storage, data processing, and data provisioning)
  • Experience as an ETL/DW Architect, Technical Lead, Senior ETL Developer, Scrum Master, Data Modeler, and Oracle Application DBA, with strong requirements-analysis, mentoring, decision-making, and team skills
  • Experienced in logical and physical dimensional data modeling using CA ERwin and other leading tools
  • Experienced in production migration/support/maintenance using BMC Control-M, Tidal, Tivoli Maestro, and CA Jobtrac
  • Expertise in building Auditing and Balancing process, Change data capture, Error Handling and Metadata capture strategies
  • Experienced in building ETL Load management and Operational management process
  • Skilled in Big Data/Hadoop technologies (Cloudera certified), Amazon Web Services, Business Objects, Cognos, MicroStrategy, etc.

TECHNICAL SKILLS

ETL/Data Integration Tools: Informatica PowerCenter 9.0/8.6/8.1, PowerConnect, PowerExchange, PowerCenter Big Data Edition, Talend Open Studio

Databases: Oracle 12/11.x, IBM Netezza 7.2, Teradata, Sybase, SQL Server, IBM DB2

Reporting tools: Business Objects XI, Cognos/ReportNet, SSRS, QlikView, Tableau

Scripting Languages: Unix Shell (Korn), Python, Perl

Design Tools: Erwin 3.5/4.1/7.6 (Dimensional Modeling)

Languages: SQL, PL/SQL, C, C++, Java, J2EE, COBOL, XML

Hadoop Technologies: Data Lake/Cloudera/HDFS/Flume/Sqoop/NoSQL/HBase/Pig/Hive/Impala/MapReduce, Spark, Amazon Web Services/EMR

PROFESSIONAL EXPERIENCE

Confidential, California

Integration Architect/Developer

Responsibilities:

  • Work with the Mercury team and external vendor teams (LiveRamp/Paragon) on requirements gathering, analysis, technical architecture, technical specifications, data models, and source-to-target mappings
  • Design and development of Informatica ETL processes using PowerCenter 9.1 for change data capture/delta Netezza staging loads from the QLC system and Guidewire PolicyCenter databases, and generation/delivery of data feeds to external vendors
  • Shared expertise in designing production load management and operational management processes, including error/reject alerts and business-user alerts on new reference data, products, sources, etc.
  • Work with business users in scheduling and supporting pre-production UAT
  • Built audit, balance and control (ABC) process for balance reconciliation and UNIX shell scripts for ETL daily and history load process, file cleansing and file archiving etc.
  • Coordinating project development/enhancement/production support activities with onsite/offshore developers, DBAs, Tivoli scheduling teams etc.
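The audit, balance and control (ABC) reconciliation mentioned above can be sketched as a simple count-and-sum comparison between source extracts and warehouse targets. This is a minimal illustrative sketch only; the table names, statistics, and function are hypothetical, not the actual production process.

```python
# Minimal sketch of an ABC balance-reconciliation check: compare row
# counts and amount totals between a source extract and its target.
# All names and figures are hypothetical illustrations.

def reconcile(source_stats, target_stats, tolerance=0.0):
    """Return a list of (table, issue) discrepancies between source and target."""
    issues = []
    for table, src in source_stats.items():
        tgt = target_stats.get(table)
        if tgt is None:
            issues.append((table, "missing in target"))
            continue
        if src["rows"] != tgt["rows"]:
            issues.append((table, "row count %d vs %d" % (src["rows"], tgt["rows"])))
        if abs(src["amount"] - tgt["amount"]) > tolerance:
            issues.append((table, "amount %s vs %s" % (src["amount"], tgt["amount"])))
    return issues

# Example: one staging table where a row was dropped during the load.
source = {"policy_stg": {"rows": 1200, "amount": 45000.00}}
target = {"policy_stg": {"rows": 1199, "amount": 45000.00}}
discrepancies = reconcile(source, target)
```

In practice these statistics would be gathered by post-load SQL queries and any discrepancy would trigger the error/reject alerts described above.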

Confidential, California

Integration Architect/Developer

Responsibilities:

  • Worked with the project management team to create a high-level cost-benefit analysis (CBA) by evaluating the effort spent on building and maintaining external data calls and data feeds
  • Leveraged the existing master list of data feeds and data calls identified during the assessment phase to create a detailed inventory of data queries (including cross-server/cross-database queries), tables, and columns (entities and attributes), plus a 'heat map' to help rationalize data calls/data feeds and their attributes
  • Created a Data Feeds/Data Calls 'Dependency Matrix' mapping the associated entities and their attributes to source systems
  • Worked with the developer/QA teams to analyze high-priority issues related to Guidewire eSubmit external data integration, identify remedial measures, and evaluate the fit of the existing central repository (EDW) to fulfill the needs of data feeds/data calls

Environment: PowerCenter 9.6.1, IBM Netezza 7.2, Oracle 12c, SQL Server 2012, DbVisualizer 9.2, Tivoli Enterprise Scheduler, Hudson deployment, Guidewire (ClaimCenter/PolicyCenter), MicroStrategy

Confidential, NJ

Data Migration Specialist

Responsibilities:

  • Requirements analysis, technical design specifications, staging/target layer table design, etc.
  • Data profiling, gap analysis, and design of the data extraction, data loading, and target-file generation processes from the current metadata platform, loading into the semantics manager metadata repository
  • Design and development of Informatica/Oracle ETL processes to load metadata reference/global tables, with dynamic target-file generation via PL/SQL and UNIX build scripts
  • Technical and business support to team members in creating mapping specifications and Requirements Specification/Verification (RSV) documents
  • Worked on migration and maintenance activities with Novartis onsite/offshore teams

Environment: PowerCenter 9.6, Erwin 8.2, Oracle 12c, AIX 5.3, VSS, etc.

Confidential, NJ

DW Architect/Developer

Responsibilities:

  • Requirements analysis, technical specification design, logical/physical model design, database table design across environments, etc.
  • Impact/gap analysis, profiling, and design of the data extraction processes from various business units, loading into the staging and data warehouse layers
  • Design and development of Informatica ETL processes to load reference tables, dimensions, facts, and aggregate tables
  • Design and development of production load management and operational management processes, including error/reject alerts, load failure/notification processes, and business-user alerts on new reference data, products, sources, etc.
  • Design and documentation of Requirements Specification/Verification (RSV) and Quality Risk Assessment documents in accordance with GMP regulatory compliance procedures
  • Performance tuning of the overall data load through Oracle partitions and sub-partitions, Oracle hints, explain plans, etc.

Environment: PowerCenter 9.1.0, Erwin 8.2, Oracle 10g/11g, Business Objects 3.x, AIX 5.3, VSS, etc.

Confidential

ETL - Tech Lead

Responsibilities:

  • Within an Agile process, worked closely with Scrum Masters, business users, business analysts, and data architects on requirements gathering, grooming, documentation, and resource/task allocation
  • Data profiling and analysis supporting business analysts and data modelers using Informatica Data Quality (IDQ)
  • Design and development of Informatica ETL processes using PowerCenter 8.6/8.1 for change data capture/delta staging loads and loading of dimensions, facts, and data marts
  • Shared expertise in designing production load management and operational management processes, including error/reject alerts and business-user alerts on new reference data, products, sources, etc.
  • Technical and business support to fellow ETL developers, data modelers, and BI developers
  • Managed sprint releases with the production team and provided production support for daily/month-end loads and reports, communicating with business users and the analytics team
  • Developed UNIX shell scripts to support ETL daily and history load processes, file cleansing, file archiving, etc.
  • Performance tuning of the overall data load through Informatica partitioning, Oracle partitions and sub-partitions, Oracle hints, explain plans, and Oracle PL/SQL programming
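The change data capture/delta staging loads above can be sketched as a snapshot comparison: each incoming row is matched to the prior snapshot by business key, with a row hash deciding whether it is new, changed, or unchanged. This is an illustrative sketch under assumed inputs, not the actual Informatica CDC implementation; the function and sample rows are hypothetical.

```python
# Hypothetical sketch of delta (CDC) detection between a prior staging
# snapshot and the current extract, keyed on a business key plus a row hash.
import hashlib

def row_hash(row):
    # Hash the whole row so any column change is detected.
    return hashlib.md5("|".join(str(v) for v in row).encode()).hexdigest()

def delta(previous, current, key=0):
    """Classify current rows as inserts or updates versus the prior
    snapshot, and report keys that disappeared (deletes)."""
    prev = {r[key]: row_hash(r) for r in previous}
    inserts, updates = [], []
    for row in current:
        h = prev.get(row[key])
        if h is None:
            inserts.append(row)          # key not seen before
        elif h != row_hash(row):
            updates.append(row)          # key seen, contents changed
    current_keys = {r[key] for r in current}
    deletes = [k for k in prev if k not in current_keys]
    return inserts, updates, deletes

prev_snap = [(1, "A", 100), (2, "B", 200), (3, "C", 300)]
curr_snap = [(1, "A", 100), (2, "B", 250), (4, "D", 400)]
ins, upd, dels = delta(prev_snap, curr_snap)
```

In an ETL tool the same comparison is typically done with lookups or joins against the prior load, with the hash persisted alongside the staging row.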

Environment: PowerCenter 8.6/9.1, Oracle 9i/10g/11g, Informatica IDQ 9.0, Cognos, SQL Server 2012/2008, AIX 5.3, Tidal Enterprise Scheduler, Subversion, VersionOne, etc.

Confidential, OH

ETL Lead /Developer

Responsibilities:

  • Requirements analysis and translation of client user requirements into technical architecture, technical specifications, and a logical data model
  • Analysis, profiling, cleansing, and data extraction from various source systems, including SAP R/3, JD Edwards, and PeopleSoft
  • Design and development of Change Data Capture (CDC) logic for delta loads using Informatica ETL processes
  • Developed and tested Informatica ETL mappings for flat-file (direct and indirect load), Oracle, SQL Server, and Teradata loads
  • Developed mapping design documents, source-to-target mappings, unit test scripts, and integration test scripts for the Informatica mappings as part of the SDLC
  • Shared expertise in building error-handling and metadata capture strategies
  • Designed ETL standards, practices, and migration process documentation
  • Developed Oracle PL/SQL stored procedures, invoked through Informatica, to derive production cycle-time calculations as part of the CPM data warehouse staging load
  • Developed Windows/UNIX shell and FTP scripts that check the availability of source flat files in the Informatica source folders while running batches, and combine column headers with table data when generating flat-file targets

Environment: PowerCenter 7.1.3/8.1.3, PowerExchange CDC 8.1, MS Visio 2002, Sybase PowerDesigner, Oracle 9i/10g, Toad 7.6, AIX 5.1, SAP R/3, SQL Server 2000, Teradata 7.1 (BTEQ), DB2/400, Business Objects XI, Windows NT/2000.

Confidential, WI

ETL Specialist/ Application DBA

Responsibilities:

  • Requirements analysis and translation of client user requirements into technical architecture decision trees, data models, and technical specifications for both Discovery Phase 1 and Phase 2
  • Managed detailed work plans and mentored peers in the design and development of the ETL architecture
  • Involved in designing logical and physical Erwin data models per business requirements
  • Analysis, profiling, cleansing, data mapping, and extraction from Mainframe, PeopleSoft, MQ Series, and DB2 applications to load staging tables using PowerExchange and PowerConnect with Informatica mappings
  • Built, loaded, and tested Informatica mappings for organization-related dimensions and factless facts, along with other regular dimensions used to load premium, claims, and certificate base facts, accumulating facts, and snapshot facts
  • Developed Oracle PL/SQL stored procedures, invoked through Informatica, to derive premium and certificate amounts

Environment: Informatica PowerCenter 7.1.3, PowerExchange 7.1.3, PowerConnect for MQ Series, PeopleSoft, Erwin 4.1, Oracle 9i, SQL Server 2000, Toad 7.1, CA Jobtrac, AIX 5.1

Confidential, NJ

Lead Developer/ Application DBA

Responsibilities:

  • Requirements analysis and development of the ER data model for Adherence Analysis for the Confidential brand of products
  • Developed design documents for building the staging, Claim Universe, Test Patient Universe, and Control Matching processes
  • Developed Informatica ETL mappings for all processes involved in Control Matching
  • Developed unit testing, integration testing, operational qualification, and production qualification scripts
  • Involved in scheduling and performance tuning of Informatica run-on-demand jobs as necessary in the production environment

Environment: Informatica PowerCenter 6.2.2/7.1.2, Erwin 4.1, MS Visio, Oracle 9i, Toad 7.1, AIX, Windows NT/2000.

Confidential, NJ

Sr. ETL/Informatica Developer/Application DBA

Responsibilities:

  • Requirements analysis, development of design specifications, and data modeling; developed and updated the Erwin data model for new and existing entities in the staging and mart development areas
  • Application DBA work: DDL scripts and performance tuning scripts in the Oracle database, including exporting tables from development to pre-production environments
  • Developed and tested Informatica ETL mappings for flat files (direct and indirect load), including creation of source definitions from the given COBOL definitions and of relational tables from DB2 (via DB2 Connect) for the staging and staging-to-mart phases, along with the respective batches, sessions, partitions, and other performance tuning; also involved in migrating Informatica folders from the development repository to the production repository
  • Developed UNIX shell scripts that check the availability of source flat files in the Informatica source folders while running batches, and combine column headers with table data when generating flat-file targets as part of post-session tasks

Environment: PowerCenter 6.2, Cognos 7.0 PowerPlay, Erwin 4.1, Oracle 9i, Toad 7.2, Maestro (scheduler), AIX 5.1, Windows NT/2000.

Confidential, NJ 

Data Warehouse/Informatica Developer/Data Modeler

Responsibilities:

  • Analysis of the Argus Safety source system to provide change data capture logic
  • Developed the data model for all data warehouse life cycle phases, including source system interface, data staging, consolidation, and customized outbound distribution
  • Developed the Quality Assurance Framework data model for reporting on Argus source system records that failed DEDW integrity checks
  • Developed Informatica ETL mappings, Oracle stored procedures, and Korn shell scripts for all phases of the data warehouse life cycle
  • Developed IQ scripts, OQ execution scripts, and overview documents under CFR standards

Environment: Informatica PowerCenter 5.1.1/6.2, Erwin 3.52, MS Visio, Oracle 8.1.7, Toad 7.1, AIX, Windows NT/2000.
