
ETL Lead Developer & Onsite/Offshore Coordinator Resume


Hartford, CT

SUMMARY:

  • Eight-plus (8+) years of IT experience in data warehousing and business intelligence, with emphasis on business requirements analysis and the design, development, testing, implementation, and maintenance of client/server data warehouse and data mart systems.
  • Expertise in the design and development of ETL methodology for data transformation and processing in a corporate-wide ETL solution using Informatica PowerCenter 9.x, Informatica PowerMart 6.2/6.1/5.1.2/5.1.1/4.7, PowerConnect, PowerPlug, PowerAnalyzer, PowerExchange, OLAP, ROLAP, MOLAP, OLTP, and Autosys.
  • Worked on data cleansing, data profiling, and data standardization using IDQ (Informatica Developer and Informatica Analyst); expertise in data quality analysis, Address Doctor, and the Parser, Exception, and Consolidation transformations with various match strategies.
  • Expertise in automating several manual reports using WebFOCUS to produce quality, timely deliverables; worked on ReportCaster and MRE.
  • Worked on the Informatica TDM/ILM workbench for data masking, applying different masking techniques to sensitive data from systems such as Vantage (VSAM), HIS, EZ App, and DB2.
  • Experience using business intelligence tools such as Business Objects XI R2/R1/6.5/6.0/5.1/5.0, WebFOCUS, Hyperion, and Brio 8/6.5. Created universes and designed and maintained various reports using Business Objects and Brio.
  • Experience using data modeling tools. Knowledge of designing and developing data marts and data warehouses using multidimensional models such as snowflake schema and star schema while implementing decision support systems. Experience with fact and dimension tables and physical and logical data modeling.
  • Extensive experience using Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, DB2, PL/SQL, SQL*Plus, SQL*Loader, and Developer 2000. Hands-on working knowledge of Oracle and PL/SQL, writing stored procedures, packages, functions, and triggers.
  • Extensively worked with Informatica PowerCenter tools to extract data from various sources, including Oracle, flat files, XML, and COBOL files.
  • Experience in performance tuning of sources, targets, mappings, sessions, and queries. Worked with Oracle stored procedures and experienced in loading data into data warehouses/data marts using Informatica and SQL*Loader. Extensive expertise in error handling and batch processing.
  • Strong knowledge of the Software Development Life Cycle (SDLC) with industry-standard methodologies such as Waterfall, Agile, and Scrum, covering requirements analysis, design, development, testing, support, and implementation.
  • Experience with TOAD, SQL Developer, and Data Studio to test, modify, and analyze data, create indexes, and compare data across schemas. Experienced in UNIX shell scripting, PL/SQL procedures, and SharePoint.
  • Worked with scheduling tools such as Autosys (creating JIL scripts; a sample JIL follows this list), ReportCaster, and Control-M, and with testing tools such as Quality Center.
  • Worked in 24/7 production support of ETL and BI applications for large insurance and banking data warehouses, covering monitoring, troubleshooting, issue resolution, and SLA reporting. Also worked on Informatica administration tasks.
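A minimal sketch of the kind of Autosys JIL referenced above, defining a command job for a nightly ETL load; the job name, script path, machine, owner, and schedule are hypothetical placeholders.

    /* Hypothetical JIL: command job that runs a nightly ETL load */
    insert_job: etl_nightly_load
    job_type: CMD
    command: /apps/etl/scripts/run_nightly_load.sh
    machine: etlprod01
    owner: etluser
    start_times: "02:00"
    days_of_week: mo,tu,we,th,fr
    std_out_file: /apps/etl/logs/etl_nightly_load.out
    std_err_file: /apps/etl/logs/etl_nightly_load.err
    alarm_if_fail: 1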

TECHNICAL SKILLS:

Data Warehousing Tools: Informatica 9.x, WebFOCUS 7.x, IDQ, ILM-TDM

Technology: UNIX, Oracle 9i/10g/11g, SQL Server 2008, DB2, Java, JSP, JavaScript, Hyperion, Brio, HTML

Tools: Autosys, Control-M, BMC Remedy, Quality Center, DMS, ClearCase, TOAD, Data Studio

WORK EXPERIENCE:

Confidential

Responsibilities:

  • Designing the technical specification and component specification documents.
  • Actively participating in requirements gathering.
  • Analyzing the FRDs, preparing the gap analysis and query logs, and updating the checklists.
  • Preparing the TDDs and source-to-target mapping extracts (mapping specs).
  • Actively involved in business discussions and in designing the data model.
  • Extensively worked on developing Informatica mappings using different transformations, sessions, and workflows.
  • Debugging and optimizing the mappings using PowerCenter Designer, applying innovative logic for maximum efficiency; expertise in performance tuning.
  • Understanding the FSDs and designing/developing the Informatica code per the requirements.
  • Establishing design/coding guidelines and best practices.
  • Involved in development, design, and offshore coordination.
  • Involved in unit testing, framework testing, and integration testing.
  • Performance tuning of the mappings.
  • Migration from Oracle to DB2.

Confidential

Responsibilities:

  • Led a team of six members offshore and two members onsite.
  • Created a front end using .NET to dynamically upload the files to be loaded and masked.
  • Created a cross-reference database to store the metadata of the masked entities.
  • Involved in the design, development, unit testing, system testing, implementation, and support for full and incremental refreshes.
  • Customized TDM rules/policies for data masking.

Confidential, Hartford, CT

ETL Lead Developer & Onsite/Offshore Coordinator

Responsibilities:

  • Led a four-member onsite team and a six-member offshore team for the WINS stream.
  • Gained a clear understanding of the source systems by going through the functional specification documents and through one-on-one interaction with the business team.
  • Developed data mapping documents containing the transformation rules that implement the business logic.
  • Developed various mappings with the collection of all sources, targets, and transformations using Designer. Used version mapping to update the slowly changing dimensions, keeping full history in the target database.
  • Used Informatica Dynamic Data Masking techniques, including masking, scrambling, hashing, randomizing, blocking, and hiding, to mask crucial fields of forwarded data. Among the techniques used were non-deterministic randomization (which replaces each field with randomly chosen data), blurring (which modifies the original data within a specific range), repeatable masking, and substitution.
  • Used the ILM Data Validation Option to compare masked values against the original data, guaranteeing that all sensitive values differ and confirming that all substituted values come from a data dictionary.
  • Used transformations such as Aggregator, Filter, Router, Stored Procedure, Sequence Generator, Lookup, Expression, and Update Strategy to implement the business logic in the mappings.
  • Used MD5 to handle the Slowly Changing Dimension (SCD) mappings, comparing data from the source and loading it into the data mart following the Type II process (see the expression sketch after this list).
  • Used Update Strategy to effectively migrate slowly changing data from the source system to the target database.
  • Involved in data profiling using Data Explorer and IDQ.
  • Understood the components of a data quality plan and made informed choices between source data cleansing and target data cleansing.
  • Designed data transformations to staging, fact, and dimension tables in the warehouse.
  • Extensively worked with various lookup caches: static, dynamic, and persistent.
  • Used PowerCenter Workflow Manager for session management, database connection management, and job scheduling.
  • Tested the data and data integrity among various sources and targets. Used the Debugger with breakpoints to monitor data movement and to identify and fix bugs.
  • Created and maintained batch scripts for pre-/post-session operations and various day-to-day operations. Used the Autosys job scheduler to schedule the Informatica jobs along with other operational processes.
  • Worked with pmcmd to interact with the Informatica server from the command line and execute the batch scripts (see the shell sketch after this list).
  • Made substantial contributions to simplifying ETL development and maintenance by creating shortcuts, reusable mapplets, and transformation objects.
  • Developed stored procedures for source pre-load and target post-load tasks around the Informatica jobs.
  • Extensively used Informatica to extract and transform data from different source systems and load it into the target database.
  • Developed reusable transformations and mapplets for data loads to the data warehouse and database.
  • Scheduled and ran extraction and loading processes using Autosys.
  • Developed unit test case scenarios for thoroughly testing the ETL processes and shared them with the testing team.
  • Developed strategies such as CDC (Change Data Capture), batch processing, auditing, and recovery.
  • Promoted code to SIT and UAT.
  • Performance-tuned a workflow that had been running for more than 9 hours down to 1 hour 20 minutes, earning appreciation from clients and managers.
  • Provided excellent onsite-offshore coordination, along with code walkthroughs and business knowledge sharing.
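A minimal sketch of the MD5-based Type II change detection mentioned above, written as hypothetical Expression-transformation ports (the column names and the lookup port are placeholders); Informatica's built-in MD5() function hashes the concatenated attributes so a single comparison detects any change.

    -- Hypothetical Expression ports for MD5-based SCD Type II detection
    v_MD5_NEW := MD5(CUST_NAME || '|' || CUST_ADDR || '|' || CUST_PHONE)
    v_CHANGED := IIF(v_MD5_NEW != LKP_MD5_EXISTING, 'Y', 'N')
    -- Downstream Update Strategy (sketch):
    --   key not found in lookup  -> DD_INSERT (new dimension row)
    --   v_CHANGED = 'Y'          -> DD_INSERT new row, DD_UPDATE to expire old row
    --   v_CHANGED = 'N'          -> DD_REJECT (no change)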
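A minimal shell sketch of the pmcmd usage mentioned above; the domain, service, folder, workflow name, and credential variables are hypothetical placeholders.

    #!/bin/ksh
    # Hypothetical wrapper: start an Informatica workflow and check the result.
    pmcmd startworkflow -sv INT_SVC -d DOM_PROD -u "$INFA_USER" -p "$INFA_PWD" \
        -f WINS_FOLDER -wait wf_daily_load
    if [ $? -ne 0 ]; then
        echo "wf_daily_load failed" >&2
        exit 1
    fi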

Confidential, Hartford, CT

ETL Lead Developer

Responsibilities:

  • Led a team of four members onsite and six members offshore.
  • Reverse-engineered the existing insurance data warehouse environment, end to end, for the Genius load environment (one of the source systems of IDW) using Informatica, SQL Server, and Autosys.
  • Verified the effort and infrastructure required to build an end-to-end environment.
  • Worked with an agile team that needed repeatable processes and reliable environments for creating and delivering working software.
  • Participated in project planning sessions.
  • Assisted in the development of deliverables and in quality reviews.
  • Assisted with issues that needed resolution from the XL side.
  • Provided direction for the day-to-day activities of the offshore team.
  • Validated that a limited data set from upstream could properly flow downstream.
  • Documented enough detail to reproduce the environment with manual effort and limited subject-matter expertise.
  • Documented lessons learned and issues encountered.
  • Developed recommendations and options for implementing parallel development, including a low-cost recommendation (no opex/capex impact: what can we do today?) and a long-term vision (which may require investment in hardware and/or tools).
  • Successfully completed the project on schedule and migrated the code to Test with minimal defects.
  • Provided detailed technical specification documentation and source-to-target mapping documentation of the configuration so that it can be executed repeatedly and consistently.

Confidential, Hartford, CT

Team Lead

Responsibilities:

  • Performed impact analysis of the modifications required in the existing architecture.
  • Led a three-member offshore team; identified the objects and streams requiring changes.
  • Involved in the data model changes and brought new ideas to the data model design.
  • Created new tables and added new columns to existing tables.
  • Developed new mappings, sessions, and workflows and enhanced the existing mappings.
  • Estimated effort for new development and built the mappings to extract the data on an incremental-load basis.
  • Assigned jobs to team members.
  • Worked with SQL overrides in the Source Qualifier and Lookup transformations.
  • Extensively worked with both connected and unconnected Lookup transformations.
  • Created indexes on the most frequently retrieved fields of tables and views.
  • Extensively worked with various lookup caches: static, dynamic, and persistent.
  • Reviewed the code and test results.
  • Performed unit and QA testing.
  • Worked on performance tuning of the data marts project, understanding the root cause of each issue and recommending tuning options among the available performance-improvement alternatives.
  • Performed root cause analysis of jobs that ran slowly and of SLAs that slipped due to performance issues.
  • Created and scheduled a stored procedure to identify unused indexes and disable them for better ETL load performance and reduced load time (see the sketch after this list).
  • Optimized the mappings by changing the logic to reduce run time.
  • Responsible for performance tuning at the mapping and session levels.
  • Delivered 100% quality deliverables with zero defects.
  • Transitioned business knowledge to new joiners.
  • Improved data warehouse performance for all the data mart jobs and brought SLAs back to their original state.
  • Fixed all reporting-related issues in record time and obtained excellent feedback from the business users.
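A minimal T-SQL sketch of the unused-index cleanup procedure described above, assuming SQL Server; the procedure name, filters, and scope are illustrative, and the real version was scheduled around the ETL load window.

    -- Hypothetical: disable nonclustered indexes with no recorded reads
    -- in sys.dm_db_index_usage_stats since the last server restart.
    CREATE PROCEDURE dbo.usp_disable_unused_indexes
    AS
    BEGIN
        DECLARE @sql NVARCHAR(MAX);
        SET @sql = N'';

        SELECT @sql = @sql
            + N'ALTER INDEX ' + QUOTENAME(i.name)
            + N' ON ' + QUOTENAME(s.name) + N'.' + QUOTENAME(o.name)
            + N' DISABLE;' + CHAR(10)
        FROM sys.indexes i
        JOIN sys.objects o ON o.object_id = i.object_id
        JOIN sys.schemas s ON s.schema_id = o.schema_id
        LEFT JOIN sys.dm_db_index_usage_stats u
               ON u.object_id = i.object_id
              AND u.index_id = i.index_id
              AND u.database_id = DB_ID()
        WHERE o.type = 'U'               -- user tables only
          AND i.type = 2                 -- nonclustered indexes
          AND i.is_primary_key = 0
          AND i.is_unique_constraint = 0
          AND ISNULL(u.user_seeks, 0) + ISNULL(u.user_scans, 0)
              + ISNULL(u.user_lookups, 0) = 0;

        EXEC sys.sp_executesql @sql;
    END;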

Confidential

ETL Developer

Responsibilities:

  • Designed and developed the ETL maps and UNIX scripts for loading the data mart tables; unit tested the ETL maps and UNIX scripts.
  • Supported the data load activities for ETL and other applications.
  • Analyzed the business-related changes that needed to be accommodated in the maps.
  • Validated the data loaded into the data mart tables against the source system.
  • Analyzed data quality issues during loads.
  • Designed, developed, and debugged ETL mappings using the Informatica Designer tool.
  • Created different target definitions using the Warehouse Designer of Informatica PowerCenter.
  • Designed mappings per the ETL specification, developed them, and optimized them for better performance.
  • Performed debugging and tuning of mappings and sessions and scheduled sessions.
  • Identified and tracked the slowly changing dimensions.
  • Developed pre- and post-SQL scripts and PL/SQL stored procedures and functions.
  • Used the Debugger to check for errors in mappings.
  • Generated UNIX shell scripts for automating the daily load processes (see the sketch after this list).
  • Documented the entire process; the documents include the mapping document, unit testing document, and system testing document, among others.
  • Generated dashboards for the business customers to view the reports.
  • Provided ad hoc reports to the customers.
  • Delivered reports in PDF, Excel, Word, and HTML formats.
  • Promoted to the next level within the team within 6 months.
  • Automated all the manual reports.
  • Automated all the jobs through Autosys and implemented error handling.
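A minimal sketch of the kind of daily-load shell script with error handling described above; the paths, control file, and credential variables are hypothetical placeholders.

    #!/bin/ksh
    # Hypothetical daily load driver: run the stage load, log output, fail loudly.
    LOG=/apps/etl/logs/daily_load_$(date +%Y%m%d).log
    {
        echo "Daily load started: $(date)"
        sqlldr userid="$DB_USER/$DB_PWD" control=/apps/etl/ctl/stage_load.ctl \
            log=/apps/etl/logs/stage_load.log
        RC=$?
        if [ $RC -ne 0 ]; then
            echo "SQL*Loader failed with status $RC"
            exit $RC
        fi
        echo "Daily load finished: $(date)"
    } >> "$LOG" 2>&1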

Confidential

ETL Developer

Responsibilities:

  • Designed and developed the ETL maps and UNIX scripts for loading the data mart tables; unit tested the ETL maps and UNIX scripts.
  • Designed and developed the front end for the users' selection criteria.
  • Implemented validation checks on each checkbox and drop-down box.
  • Populated the drop-down boxes with data from the database tables.
  • Generated the GUI reports for the business users through WebFOCUS coding (see the sketch after this list).
  • Performed code review and unit testing.
  • Migrated code using the ClearCase (SCM) tool.
  • Supported the data load activities for ETL and other applications.
  • Automated several reports to produce quality deliverables on time, without human intervention, after the data load.
  • Analyzed the business-related changes that needed to be accommodated in the maps.
  • Validated the data loaded into the data mart tables against the source system.
  • Analyzed data quality issues during loads.
  • Designed, developed, and debugged ETL mappings using the Informatica Designer tool.
  • Created different target definitions using the Warehouse Designer of Informatica PowerCenter.
  • Designed mappings per the ETL specification, developed them, and optimized them for better performance.
  • Performed debugging and tuning of mappings and sessions and scheduled sessions.
  • Identified and tracked the slowly changing dimensions.
  • Developed pre- and post-SQL scripts and PL/SQL stored procedures and functions.
  • Used the Debugger to check for errors in mappings.
  • Generated UNIX shell scripts for automating daily load processes.
  • Documented the entire process; the documents include the mapping document, unit testing document, and system testing document, among others.
  • Associate scorecard: the report generation is generic, so adding columns to the existing database requires no code changes in the reporting. Won the Best Project award within e-Analytics for the Associate scorecard.
  • Received the Associate of the Month award twice in a year.
  • Received the Extra Mile award for completing a project in a short span of time.
  • Received appreciation from the business customers for automating all business reports.
  • Promoted to the next level within the team.
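A minimal WebFOCUS sketch of the kind of report described above; the master file (SALESMART), fields, and heading are hypothetical, and the PCHOLD format can be switched among PDF, EXL2K (Excel), DOC, and HTML as noted earlier.

    -* Hypothetical WebFOCUS report: totals by region, delivered as PDF
    TABLE FILE SALESMART
    SUM SALE_AMT
    BY REGION
    BY PRODUCT_LINE
    HEADING
    "Sales by Region and Product Line"
    ON TABLE PCHOLD FORMAT PDF
    END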
