
Sr. Informatica Developer And Administrator Resume

Roseville, CA

Summary

  • Senior Software Engineer with 10+ years of experience spanning the complete lifecycle of Data Warehouse, Business Intelligence (BI), and Integration projects in specialized vertical markets such as Healthcare, Utilities, Banking, and Technology. Able to multi-task, prioritize, and anticipate project expectations. Highly motivated, innovative quick learner with excellent analytical and organizational skills.
  • Involved in all aspects of data warehouse development: gathering requirements, designing ETL processes, developing and automating ETL processes, and building error handling and audit control frameworks. Able to read logical and physical data models and understand the significance of natural keys and surrogate keys. Capable of writing complex SQL queries and UNIX shell scripts.
  • 9 years of hands-on experience in Informatica, including the creation of mappings, sessions, worklets, and workflows, and loading from Informatica into Teradata using TPT.
  • Upgraded all environments from Informatica 9.x by installing and configuring Informatica 10.2, and deployed code from lower environments to production.
  • Administered users and user profiles.
  • Coordinated with the DBA team on customized data refreshes of lower environments from the production database, patch installations, testing of all critical ETL processes for performance degradation or failures, performance optimization, partition maintenance, and compression.
  • Used the Data Quality package with existing rules to standardize, validate, and cleanse raw source data.
  • Created Ab Initio graphs.
  • Created UC4 objects including jobs, job plans, events, and schedules.
  • Created Autosys boxes and jobs and maintained their dependencies.
  • Created folders and checked code in and out, including mappings and database scripts.
  • Good understanding of HDFS and the MapReduce model.
  • Used SymmetricDS to replicate data into the AWS cloud: sourced data from Salesforce using Heroku Connect, staged it in PostgreSQL, and then replicated it from PostgreSQL into AWS with SymmetricDS (see the configuration sketch after this list).
  • Oracle SQL*Loader utility (see the loader sketch after this list).
  • SQL, UNIX shell, Perl, C++, and other object-oriented languages.
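
As context for the SymmetricDS replication above, here is a minimal configuration sketch. SymmetricDS is configured by inserting rows into its sym_* tables in the source database; the node groups (staging, aws), the channel, and the source table name (sf_account) below are hypothetical placeholders, not the actual project configuration.

    # Minimal SymmetricDS trigger/router setup, run against the PostgreSQL
    # staging database. Node groups "staging" and "aws" and the source table
    # "sf_account" are hypothetical placeholders.
    psql -h staging-db -U symmetric -d stage <<'SQL'
    -- Dedicated channel for the Salesforce-sourced table
    INSERT INTO sym_channel (channel_id, processing_order, enabled)
    VALUES ('salesforce', 1, 1);

    -- Capture changes on the staging table
    INSERT INTO sym_trigger (trigger_id, source_table_name, channel_id,
                             last_update_time, create_time)
    VALUES ('sf_account_trg', 'sf_account', 'salesforce',
            current_timestamp, current_timestamp);

    -- Route captured changes from the staging node group to the AWS node group
    INSERT INTO sym_router (router_id, source_node_group_id, target_node_group_id,
                            router_type, create_time, last_update_time)
    VALUES ('staging_to_aws', 'staging', 'aws', 'default',
            current_timestamp, current_timestamp);

    INSERT INTO sym_trigger_router (trigger_id, router_id, initial_load_order,
                                    create_time, last_update_time)
    VALUES ('sf_account_trg', 'staging_to_aws', 100,
            current_timestamp, current_timestamp);
    SQL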
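
And a minimal SQL*Loader sketch for the bullet above, assuming a hypothetical comma-delimited extract and staging table; the credentials, file names, and columns are placeholders.

    # Hypothetical control file: load a delimited extract into a staging table
    cat > customers.ctl <<'CTL'
    LOAD DATA
    INFILE 'customers.dat'
    APPEND INTO TABLE stg_customers
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (customer_id, customer_name, created_dt DATE 'YYYY-MM-DD')
    CTL

    # Invoke SQL*Loader; rejected rows go to the .bad file, details to the log
    sqlldr userid=etl_user/etl_pass@ORCL control=customers.ctl \
           log=customers.log bad=customers.bad errors=50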

EMPLOYMENT HISTORY:

Confidential, Roseville, CA

Sr. Informatica Developer and Administrator

Environment: Informatica 9.6, Informatica Data Quality, PowerExchange, Oracle 11g, UNIX shell

Responsibilities:

  • Maintained the business intelligence infrastructure by ensuring the reporting environment functioned well. We used Hyperion BI 9 and Essbase cubes, and I was responsible for recycling the Hyperion services running on Windows, including the Hyperion Workspace web application, Hyperion Financial Reporting services, Hyperion DRM, and Hyperion print services.
  • Coordinated with system administrators on UNIX server patching for the Solaris servers and their respective virtual machines/zones.
  • Informatica to Pentaho conversion: converted Informatica code into Kettle transformations and scheduled the newly created Pentaho transformations using Autosys (see the scheduling sketch after this list).
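
As an illustration of the Autosys scheduling above, a minimal JIL definition for a command job that runs a Kettle transformation with Pentaho's pan.sh. The job name, machine, paths, and schedule are hypothetical.

    # Hypothetical JIL: define a daily Autosys command job for one transformation
    jil <<'JIL'
    insert_job: pdi_load_sales   job_type: c
    command: /opt/pentaho/data-integration/pan.sh -file=/etl/trans/load_sales.ktr
    machine: etlserver01
    owner: etluser
    start_times: "02:00"
    days_of_week: all
    std_out_file: /var/log/autosys/pdi_load_sales.out
    std_err_file: /var/log/autosys/pdi_load_sales.err
    JIL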

Confidential

Software Engineer Team Lead

Environment: SQL Server 2008, SSIS, Informatica 9.5, PowerShell, SVN

Responsibilities:

  • As a senior Informatica developer for Kaiser's Pharmacy DW, I developed a UNIX batch framework that helped cut down the batch runtime.
  • Parallelized the execution of Informatica workflows that shared a common target. Such execution was critical because multiple pharmacies spread across different time zones had to be processed based on the local server time (see the sketch after this list).
  • Created technical spec documentation and developed ETL solutions for various user stories as part of the agile methodology.
  • Gained knowledge of Oracle GoldenGate replication, which replicated pharmacy store-specific data onto an Oracle Exadata PERC server.
  • Replication was accomplished using the GoldenGate data pump process in conjunction with trail files.
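
A minimal sketch of the parallel workflow execution described above, using Informatica's pmcmd utility; the service, domain, folder, and workflow names are hypothetical placeholders.

    #!/bin/ksh
    # Launch one workflow per pharmacy region in the background, then wait
    # for all of them; -wait makes each pmcmd call block until its workflow ends.
    for region in west central east; do
        pmcmd startworkflow -sv INT_SVC -d DOMAIN_PROD \
              -u "$PM_USER" -p "$PM_PASS" \
              -f PHARMACY_DW -wait "wf_load_pharmacy_${region}" &
    done
    wait    # proceed only after every regional load has completed
    echo "All regional pharmacy loads finished"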

Confidential, San Ramon, CA

Environment: Informatica 9.1.0, Oracle 11g, Oracle OBIEE, UNIX

Senior Data Engineer

Responsibilities:

  • Responsible for the design, development, and enhancement of Finance Data Mart.
  • The development of the Confidential is one of the most important strategic initiatives for Confidential's Finance Technology Dept.
  • Confidential houses and defines General Ledger and Instrument level data both from an EDW and finance perspective.
  • My responsibility was to perform a gap analysis between the existing disparate source systems and determine whether Confidential could be utilized as the source for the Risk Data Analytics team's government-mandated CCAR reports.

Confidential, San Ramon, CA

Sr. Programmer/Analyst

Environment: Informatica 9.1.0, Business Objects XI, Oracle 11g, Teradata 14.0, PVCS, UC4 job scheduler, UNIX

Responsibilities:

  • Took ownership of complex legacy code.
  • Gathered requirements from the business and provided an overall estimate.
  • Conducted design and code reviews.
  • Ensured that the code was built as per the specifications.
  • Worked diligently to accommodate last-minute changes requested by the business.
  • Coordinated with the DBA team on customized data refreshes of lower environments from the production database, patch installations, testing of all critical ETL processes for performance degradation or failures, performance optimization, partition maintenance, and compression.
  • Studied the source system and application in alignment with the business goal.
  • Took ownership of the design and development of the SMOC Data Mart: analyzed the requirements and provided design specifications. This was a complex project that required in-depth knowledge of the database, table sizes, and writing efficient code.
  • Designed and developed code in accordance with the BI CoE standards.
  • Created testing documents and supported the QA during integration testing and UAT.
  • Received excellent recognition for the work performed from the director of Smart Meter Operations.
  • Ensured that the Field Automation Services ETL ran without any incidents.
  • Designed and developed a solution to provide Confidential & Confidential Gas Operations with a detailed report containing the geographic coordinates of field technicians beyond a specified distance from a Gas Leak Field Order.
  • The process gave the business user an ad hoc ability to retrieve the GPS data for a given technician by date. An efficient ETL design was developed keeping in mind the volume of data acquired from the GPS reads (one read every 10 minutes). The reports were available through the standard Confidential & Confidential enterprise Business Objects repository (see the query sketch after this list).
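
A minimal sketch of the ad hoc GPS retrieval described above, run from the shell; the gps_reads table, its columns, and the connection string are assumptions for illustration, not the actual design.

    #!/bin/ksh
    # Retrieve GPS reads for one technician on one day; $1 = tech id, $2 = YYYY-MM-DD
    sqlplus -s etl_user/etl_pass@FDM <<SQL
    SELECT read_ts, latitude, longitude
    FROM   gps_reads
    WHERE  technician_id = '$1'
    AND    read_ts >= TO_DATE('$2', 'YYYY-MM-DD')
    AND    read_ts <  TO_DATE('$2', 'YYYY-MM-DD') + 1
    ORDER  BY read_ts;
    SQL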

Confidential, San Francisco, CA

Sr. Programmer

Environment: Informatica 8.x, UNIX, Oracle 10g, Mercury Quality Center, Facets, IDE and IDQ, CDMA (Code Data Mapping Application), ABC (Audit Balance and Control) metadata repository

Responsibilities:

  • Integrated health care provider and membership data from different sources into the Rosetta Operational Data Store to create a single data repository with all the business information about providers and members.
  • Performed multiple roles simultaneously: project development, managing the production support team, data quality, and code tuning.
  • Performed code reviews of ETL code developed by offshore developers. Demonstrated a high level of technical proficiency in designing and deploying Informatica PowerCenter and PowerExchange code.
  • Used address standardization and other modules in IDQ and clearly communicated the profiling results to senior data analysts and SMEs.
  • The following were some of the major accomplishments:
  • Successfully implemented the ABC (Audit, Balance and Control) framework: a parameterized audit, balancing, and control infrastructure supporting the ETL and data warehouse architecture with the intent of providing high data quality.
  • The ETL process that loaded data from the different source systems into the EDW was driven by the ABC tables and a set of Perl scripts.
  • The ABC process controlled the execution of ETL jobs, passed variables to Informatica mappings and SQL scripts at runtime, provided restart and recovery capabilities when there was an issue with the ETL process, and improved the ETL process by providing statistics about job execution and allowing optimal job sequencing (see the wrapper sketch after this list).
  • Took a lead role in designing and developing the ABC framework using Informatica, Perl, and ksh scripts.
  • Successfully designed, developed, and implemented an Informatica solution to provide users with critically needed input files, which in turn created the eligibility files that went to a diagnostics center.
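
A minimal sketch of how such an ABC-driven wrapper can work, written in ksh; the abc_job_run control table, its columns, and the parameter-file layout are hypothetical illustrations, not the actual framework.

    #!/bin/ksh
    # ABC-style wrapper: read run parameters from a control table, build an
    # Informatica parameter file, run the workflow, and record the outcome so
    # a failed run can be restarted under the same batch id. Assumes the
    # control row for this job/batch was seeded by an earlier ABC step.
    JOB=$1
    BATCH=$(sqlplus -s abc_user/abc_pass@EDW <<SQL
    SET HEADING OFF FEEDBACK OFF
    SELECT MAX(batch_id) FROM abc_job_run
    WHERE  job_name = '$JOB' AND status <> 'SUCCESS';
    SQL
    )

    # Pass the runtime variables the mapping expects via a parameter file
    cat > /etl/param/${JOB}.parm <<EOF
    [Global]
    \$\$BATCH_ID=$BATCH
    EOF

    pmcmd startworkflow -sv INT_SVC -d DOMAIN_PROD -u "$PM_USER" -p "$PM_PASS" \
          -f EDW -paramfile /etl/param/${JOB}.parm -wait "$JOB"
    RC=$?

    # Record success or failure for restart/recovery and run statistics
    STATUS=SUCCESS; [ $RC -ne 0 ] && STATUS=FAILED
    sqlplus -s abc_user/abc_pass@EDW <<SQL
    UPDATE abc_job_run SET status = '$STATUS', end_time = SYSDATE
    WHERE  job_name = '$JOB' AND batch_id = $BATCH;
    COMMIT;
    SQL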

Confidential, Phoenix, AZ

Sr. Informatica Developer

Environment: Informatica 7.1.4, UNIX, DB2

Responsibilities:

  • Primarily involved with the creation of complex Informatica mappings, worklets, and workflows.
  • Scheduled sessions and workflows on the Informatica server using Informatica Workflow Manager.
  • Used decision tasks, event-wait and event-raise tasks, and control tasks. Supported existing Perl and Korn shell scripts used for data management.
  • Analyzed, developed, tested, and documented ETL code.
  • Versioned code and created deployment groups for migration.
  • Provided production support.

Confidential, Chicago, IL

ETL Consultant

Environment: Informatica 6.x/7.x, Ab Initio, UNIX, Oracle, Mercury Quality Center, TOAD, SQL Developer, SQuirreL for DB2, Aqua Data Studio, Eclipse for Java, PVCS Version Manager

Responsibilities:

  • Created ETL mappings using Informatica to move data from multiple sources such as flat files, COBOL copybooks, and relational tables into a common staging area and eventually to the data warehouse and data marts. Developed complex mappings using transformations (dynamic Lookup, Update Strategy, Router, Normalizer, Union).
  • Implemented SCD Type 2 for dimensions (see the SQL sketch after this list).
  • Used Workflow Manager for creating and running sessions concurrently by using decision tasks, event-wait and event-raise tasks.
  • Translated the business rules into Informatica mappings for building the data mart. Designed, developed, and unit-tested mappings using transformations such as Lookup, Joiner, Aggregator, Mapplet, Router, Rank, Filter, Source Qualifier, and External Procedure, working in Source Analyzer, Warehouse Designer, Mapplet Designer, Mapping Designer, and Transformation Developer.
  • Created reusable transformations and Mapplets to use in multiple mappings.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Troubleshot problems by checking session and error logs.
  • Scheduled sessions and batches on the Informatica server using Informatica Workflow manager.
  • Created Tasks, Workflows, Worklets to move the data at specific intervals on demand using Workflow Manager and Workflow Monitor.
  • Diligently researched and gained in-depth knowledge of ABC. Created ABC metadata to schedule jobs and maintain dependencies.
  • Assisted in code migration activities between different environments. Performed data loads in three different environments (DEV, SIT, and UAT) and monitored data loads effectively.
  • Created new Perl and UNIX scripts and supported existing ones used to perform data management.
  • Created documentation for production support. Documents contained specific steps to start, monitor, and debug load runs in production environment.
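
As an illustration of the SCD Type 2 load mentioned above, a minimal SQL sketch run from the shell; the dim_customer and stg_customer tables and their columns are hypothetical, and in the actual project this logic lived in Informatica mappings with Update Strategy transformations.

    #!/bin/ksh
    # SCD Type 2 sketch: expire the current row when an attribute changes,
    # then insert the new version with an open-ended effective date range.
    sqlplus -s etl_user/etl_pass@DWH <<'SQL'
    -- Close out current dimension rows whose attributes changed in staging
    UPDATE dim_customer d
    SET    d.eff_end_dt = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1 FROM stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.customer_name <> d.customer_name);

    -- Insert a new current version for changed and brand-new customers
    INSERT INTO dim_customer
           (customer_key, customer_id, customer_name,
            eff_start_dt, eff_end_dt, current_flag)
    SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.customer_name,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y');
    COMMIT;
    SQL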
