Informatica/ODI Developer Resume
Dearborn, MI
SUMMARY:
- IT professional with 12+ years of leadership experience in application development, analysis, testing, and training
- Strong understanding of data warehousing: star schemas, fact and dimension tables, slowly changing dimensions (a Type 2 sketch follows this list), data transformation, data mapping, data cleansing, and data profiling
- Over 7 years of experience with Informatica PowerCenter 8.4, 9.0, 9.2.3, developing complex transformations, mappings, mapplets, and workflows
- Hands-on experience with shell scripting, job scheduling, AutoSys, and performance tuning
- Highly proficient in SQL (Oracle, SQL Server, Netezza, Greenplum, Teradata, DB2) and PL/SQL, developing procedures, user-defined functions, packages, synonyms, and complex analytical functions
- Strong understanding of data validation, metadata, data governance, master data management, data migration, data archival, data cleansing, data extraction, data transformation, and data loading
- Excellent understanding of the banking and financial sector: consumer banking, investment banking, mortgage, collateralized debt obligations, collateralized mortgage obligations, value-at-risk, securitization, accounting, FAS 140, FAS 133, G-fee, options, derivative accounting, repo/reverse repo, and option pricing
- Good conceptual understanding of SOX, COBIT, COSO, CAVR, Six Sigma, and data governance
- Advanced skills in SQL, UNIX, PL/SQL, VBA, Perl, SAS, NoetixViews, and Noetix Analytics
- Solid understanding of data normalization (first, second, and third normal forms), data marts, data warehouses, star schemas, facts and dimensions, object-oriented programming, source-to-target mapping, data quality, metadata, and data retention and archival
- Advanced skills in reporting applications like Cognos, Business Objects, SAS
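A minimal sketch of the Type 2 slowly-changing-dimension load referenced above, expressed in T-SQL; DIM_CUSTOMER, STG_CUSTOMER, and all column names are hypothetical placeholders, not tables from any engagement described below:

    -- Expire the current dimension row when a tracked attribute changes
    UPDATE d
    SET    d.effective_end = GETDATE(), d.is_current = 0
    FROM   dbo.DIM_CUSTOMER AS d
    JOIN   dbo.STG_CUSTOMER AS s ON s.customer_id = d.customer_id
    WHERE  d.is_current = 1
      AND (s.address <> d.address OR s.segment <> d.segment);

    -- Insert a new current version for changed and brand-new customers;
    -- rows expired above no longer satisfy the NOT EXISTS check
    INSERT INTO dbo.DIM_CUSTOMER
           (customer_id, address, segment, effective_start, effective_end, is_current)
    SELECT s.customer_id, s.address, s.segment, GETDATE(), NULL, 1
    FROM   dbo.STG_CUSTOMER AS s
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.DIM_CUSTOMER AS d
                       WHERE d.customer_id = s.customer_id AND d.is_current = 1);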
TECHNICAL SKILLS:
Frontend Tools: Cognos, Business Objects, SAS, MicroStrategy, Tableau
ETL: Informatica PowerCenter, SSIS
Programming Languages: PL/SQL, Java, Perl, VBA, VB, UNIX shell scripting
Databases: MS Access, Oracle 11g, SQL Server, Teradata, DB2, Sybase, SharePoint
Web Technologies: HTML, DHTML, ASP, CSS, XML, Tomcat, WebLogic
Testing Tools: Rational Test Suite, LoadRunner, QuickTest Pro
Bug Reporting Tools: DOORS, Rational ClearQuest, Quality Center
ERP: SAP, PeopleSoft, Oracle Financials
EXPERIENCE:
Confidential, Dearborn, MI
Informatica/ODI Developer
Environment: SQL Server, Oracle, Informatica, UNIX, AutoSys, Teradata
Responsibilities:
- Develop data requirement and data mapping documents for the Transfer Simplification (TRANSIM) migration project.
- Load system-of-record (SOR) data from the landing area into the raw, enrichment, export, and target tables using Informatica.
- Liaise with application business owners to understand the business requirements and document them.
- Develop database objects such as user-defined functions and stored procedures using T-SQL scripts (a minimal sketch follows this list).
- Work with downstream users of AML application systems and respond to their data requirements.
- Develop key documents such as high-level design documents, low-level design documents, accelerated change documents, workflow diagrams, use cases, and data flow diagrams.
- Work with business owners to troubleshoot issues with Hyperion and OBIEE reports.
- Key participant in project planning, estimation, solution development, testing and delivery
- Provide analytical support to different stakeholders, analyze issues/enhancements
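For illustration, a minimal T-SQL procedure of the kind described in the database-objects bullet above; the procedure, table, and parameter names are hypothetical:

    -- Hypothetical procedure: AML transactions at or above a threshold for a date
    CREATE PROCEDURE dbo.usp_GetLargeTransactions
        @AsOfDate  DATE,
        @MinAmount DECIMAL(18, 2) = 10000
    AS
    BEGIN
        SET NOCOUNT ON;
        SELECT t.transaction_id, t.account_id, t.amount, t.posted_date
        FROM   dbo.TRANSACTIONS AS t
        WHERE  t.posted_date = @AsOfDate
          AND  t.amount >= @MinAmount;
    END;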
Informatica/ODI Developer
Responsibilities:
- Work with the project manager and technology architects to gather data requirements for migrating data from heterogeneous source systems to the landing area in SQL Server.
- Develop T-SQL stored procedures, user-defined functions, and other SQL Server database objects to encapsulate business logic.
- Develop high-performance T-SQL queries using complex joins and advanced indexing techniques to optimize database operations (see the covering-index sketch after this list).
- Work with business owners to troubleshoot issues regarding Hyperion reports and OBIEE reports.
- Develop complex Informatica mappings, sessions and workflows in line with business rules to migrate data from the source systems to the data warehouse.
- Develop PL/SQL stored procedures, user-defined functions, packages, and other Oracle database objects to migrate data from source systems to the data warehouse.
- Generate datasets by writing SQL queries that provide insights for the project stakeholders in designing the new application.
- Work with legacy application support teams to understand existing data and answer data questions for next-generation application development.
- Configure complex built-in transformations such as Lookup, Joiner, XML, and Web Services.
- Develop workflows that expedite large-volume data movement by implementing pushdown optimization and partitioning in Informatica.
- Develop UNIX shell scripts to schedule Informatica workflows.
- Analyze large datasets using Python/Perl.
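A sketch of the covering-index technique referenced above (all names are placeholders): the INCLUDE columns let the query below be answered from the index alone, avoiding key lookups against the base table.

    -- Covering nonclustered index: seek on account_id/posted_date,
    -- carry the selected columns in the index leaf level
    CREATE NONCLUSTERED INDEX IX_Transactions_Account
        ON dbo.TRANSACTIONS (account_id, posted_date)
        INCLUDE (amount, status);

    DECLARE @AccountId INT = 1001, @StartDate DATE = '2015-01-01';

    -- Satisfied entirely from IX_Transactions_Account
    SELECT posted_date, amount, status
    FROM   dbo.TRANSACTIONS
    WHERE  account_id = @AccountId
      AND  posted_date >= @StartDate;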
Informatica/ODI Developer
Environment: Informatica, ODI, Oracle, Hyperion, OBIEE
Responsibilities:
- Develop advanced database objects, stored procedures, user-defined functions to represent business logic in technical terms.
- Develop high-performance queries, complex joins and advanced indexing techniques to optimize database operations.
- Develop a new process to extract data from existing systems and add data to support Group Voluntary and Work Life Benefits (GVWB) broker marketing.
- Create Informatica mappings, sessions, and workflows for the ETL process that loads data into the enterprise data warehouse (EDW).
- Configure complex built-in transformations such as Lookup, Joiner, XML, and Web Services.
- Develop workflows that expedite large-volume data movement by implementing pushdown optimization and partitioning in Informatica.
- Develop reusable transformations and mapplets to simplify and expedite Informatica development.
- Perform Salesforce.com data validation for the monthly reports using Tableau.
Informatica/ODI Developer
Environment: Informatica, ODI, Oracle, Hyperion, OBIEE
Responsibilities:
- Work with business SMEs and business analysts to develop database objects, stored procedures, user-defined functions, and packages that capture business logic.
- Develop high-performance queries and optimized SQL constructs to improve database performance.
- Identify performance bottlenecks and refactor database objects to improve application response times.
- Develop ad-hoc and business reports using Tableau.
- Lead the design, development, testing, and documentation of all aspects of ETL for migrating data from multiple source systems into the staging area.
- Develop Informatica mappings, workflows to migrate data from the source systems to the data warehouse.
- Develop PL/SQL objects (stored procedures, functions, packages) to migrate data from source systems to the data warehouse.
- Key participant in project management meetings; influence solution-development decisions by carefully evaluating proposals and suggesting alternatives.
- Aid in the development of Informatica mappings and workflows to implement ETL solution for data movement to staging
- Build relationships with the senior management team, project partners, and other stakeholders through sustained high-quality deliveries, a team-player mindset, and a can-do attitude.
T-SQL Developer
Responsibilities:
- Develop complex SQL queries and optimize existing SQL objects to improve application performance.
- Work with the business to understand requirements and convert them into technical specifications.
- Develop stored procedures, user-defined functions, indexes, views, and packages.
- Analyze large volumes of data using UNIX/Perl and Python.
- Develop SSRS reports modeled on legacy reports, with enhancements.
- Work with the development team in communicating business requirements, evaluating and recommending technical solutions, and testing.
- Develop key documents such as high-level design documents, low-level design documents, accelerated change documents, workflow diagrams, use cases, and data flow diagrams.
- Key participant in project planning, estimation, solution development, testing and delivery
- Provide analytical support to different stakeholders, analyze issues/enhancements
- Evaluate technical solutions and influence the design, coding, testing and delivery of technical solutions.
- Work with the development team and subject matter experts in documenting workflow logic while migrating FPS from .NET 1.0 to .NET 4.0.
- Develop a test plan, test strategy and test scripts for component integration testing phase.
- Investigate the root cause of data variance between actual and expected result and communicate them to the development team.
Informatica Power Center Developer
Responsibilities:
- Develop complex T-SQL queries to validate that cost-center segmentation has been accurately captured for first-lien, held-for-sale secondary mortgages.
- Develop Informatica mappings, sessions, and workflows to migrate data from the source systems to the landing layer and on to the data warehouse.
- Develop complex reusable objects such as mapplets, and complex transformations such as Joiner, Aggregator, XML, and Web Services, to implement business logic.
- Analyze large volumes of data using UNIX/Perl and Python programs.
- Develop complex SQL queries to validate that the data archival process accurately picks up paid, charge-off, and void customers.
- Work collaboratively with the development team in analyzing legacy application table structures and data elements to develop robust, scalable code for high-volume data migration.
- Perform root-cause analysis of data variance between expected and actual results for the data migration/archival process.
Developer
Responsibilities:
- Develop complex T-SQL queries to validate that cost-center segmentation has been accurately captured for first-lien, held-for-sale secondary mortgages.
- Develop Informatica mappings, sessions, and workflows to migrate data from the source systems to the landing layer and on to the data warehouse.
- Develop complex reusable objects such as mapplets, and complex transformations such as Joiner, Aggregator, XML, and Web Services, to implement business logic.
- Analyze large volumes of data using UNIX/Perl and Python programs.
- Investigate reasons for exceptions between actual and expected results and escalate findings to the development team
- Perform data migration by loading data from several ODS systems to target systems using SSIS.
- Follow up on bug fixes and execute regression testing after code changes.
- Develop Program Change Request (PCR) based on High-Level Design (HLD) and Low-Level Design (LLD) documents, and inputs from key project stakeholders
- Work collaboratively with the team lead in developing high-level design documents and associated project artifacts for the TDR-NCL and Clear Optimization projects.
- Work collaboratively with the Development team in clarifying Program Change Requests.
- Respond to ad-hoc requests for mortgage-related data analysis from business and technical heads.
- Gather requirements for validating data and develop SQL-based test scripts for data validation (see the reconciliation sketch after this list).
- Test mortgage-related data (HFS and HFI loans) to verify that segmentation has been applied correctly and that journal entries are correctly being booked to the new cost centers.
- Perform root-cause analysis of data variance between expected and actual results for the GLSR project.
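A minimal example of the SQL-based validation scripts mentioned above, assuming hypothetical src and dwh databases reachable from one server:

    -- Row-count reconciliation between source and target
    SELECT 'source' AS side, COUNT(*) AS row_cnt FROM src.dbo.LOANS
    UNION ALL
    SELECT 'target', COUNT(*) FROM dwh.dbo.FACT_LOANS;

    -- Source rows missing from the target, compared column by column;
    -- any output here is a data variance to investigate
    SELECT loan_id, cost_center, balance FROM src.dbo.LOANS
    EXCEPT
    SELECT loan_id, cost_center, balance FROM dwh.dbo.FACT_LOANS;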
ETL Developer/Report Developer
Responsibilities:
- Work closely with the AML business group to generate reports in SAS/Business Objects by consolidating data residing in different databases (Teradata, SQL Server, and Oracle), helping auditors verify compliance with regulations.
- Consolidate data residing in multiple databases/platforms using SAS to generate actionable business intelligence.
- Migrate data from multiple source systems to development/test environment using SSIS.
- Work collaboratively with the application development team to support software/application testing in conformity with Confidential's Distributed Data Environment (DDE) standards.
- Develop test cases, test scripts, and other test artifacts per Confidential's DDE standards, based on the business requirement document and technical design document, to support integration testing.
- Validate data movement from source-to-target; identify probable causes for data variance.
- Identify software defects and perform regression testing to validate defect resolution.
- Validate data transformation during data movement from source-to-target.
- Prepare appropriate data for loading into test environment for application testing.
- Identify variance between actual and expected results and escalate issues to the development team for software bug fixes.
- Perform regression testing and provide sign-off on test results for code movement to production.
- Analyze business requirements and evaluate the software testing methodology for different components of CCB integration.
Informatica Developer
Responsibilities:
- Developed requirement documents for source-to-target mapping of column fields for the relevant tables
- Performed data profiling, data cleansing, and data validation using SAS (an equivalent SQL profiling sketch follows this list)
- Performed data validation and data variance analysis using SAS
- Developed the test strategy and performed data preparation, data cleansing, and data loading
- Developed test plan, test cases and test scripts for UAT testing
- Worked collaboratively with the SIT team in automating SIT test cases, analyzing failures, and remediating defects
- Performed root-cause analysis of data variance in data quality report generated by SAS
- Automated UAT test cases using VBA
- Analyzed data variance in the TDQ exception report during UAT phase
- Worked closely with the SAS Development team and the ETL development team in fixing defects
- Performed manual testing of non-automated test scripts as well as regression testing once defects were fixed
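The profiling step above was done in SAS; for illustration, an equivalent column profile expressed in SQL (table and column names hypothetical):

    -- Basic column profile: volume, null counts, cardinality, value range
    SELECT COUNT(*)                                        AS total_rows,
           SUM(CASE WHEN tax_id IS NULL THEN 1 ELSE 0 END) AS null_tax_id,
           COUNT(DISTINCT customer_id)                     AS distinct_customers,
           MIN(open_date)                                  AS min_open_date,
           MAX(open_date)                                  AS max_open_date
    FROM   stg.CUSTOMER_ACCOUNTS;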
Informatica Developer
Responsibilities:
- Developed source-to-target documentation for mapping legacy data with data mart layer
- Participated in data normalization exercises
- Designed fact, dimension, and reject tables (a reject-routing sketch follows this list)
- Analyzed the data model and suggested changes to support the presentation layer in Cognos
- Performed root-cause analysis of data variance between source system and data mart
- Developed data quality rules based on inputs from business leaders
- Recommended strategies for improving the data quality to avoid bad data in data mart
- Prepared test data for application testing and loaded it to the Oracle database
- Developed testing strategy for new business intelligence application
- Developed test cases and test scripts using SQL for back-end testing
- Prepared test cases using UNIX shell scripting for regression testing.
- Developed automated test cases using VBScript for parameterized testing in QTP
- Filed defects in Rational ClearQuest and assigned them severity levels
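A sketch of the reject-table pattern noted above: incoming rows whose dimension key cannot be resolved are routed to a reject table rather than loaded into the fact table (all names hypothetical):

    -- Route rows with unresolvable dimension keys to the reject table
    INSERT INTO dm.REJECT_SALES (sale_id, customer_id, amount, reject_reason)
    SELECT s.sale_id, s.customer_id, s.amount, 'Unknown customer key'
    FROM   stg.SALES AS s
    LEFT JOIN dm.DIM_CUSTOMER AS d ON d.customer_id = s.customer_id
    WHERE  d.customer_id IS NULL;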
Business Objects Report Developer
Responsibilities:
- Developed functional specification document for business reports on Confidential analytics
- Interviewed SMEs to develop the validation and transformation rules for ETL process
- Prepared the high-level test strategy, test plan and test scripts for SIT/UAT testing.
- Developed test cases for regression testing in UNIX using Shell Scripting
- Prepared test data for application testing and loaded it to the Oracle database
- Analyzed mortgage-backed securities to establish the correct basis for interest-only and principal-only cash flows in line with US GAAP FAS 91 guidelines
- Generated business reports using Business Objects, Hyperion and Cognos
- Analyzed MRB to establish the correct basis for cash flow amortization per FAS 140
- Analyzed Confidential's Real Estate Mortgage Investment Conduit (REMIC) portfolio to establish the correct basis for amortization of cash flows in line with FAS 144 US GAAP guidelines
- Mapped the cash flow data from the Restatement Financial Data Warehouse (RFDW) into the Securities Cost Basis Sub-Ledger (SCBSL) and the general ledger
- Performed application testing in SIT phase, UAT and pre-production environment
- Performed regression testing in UNIX after bug fixes and after every new code release