Informatica Lead Developer/ Data Analyst Resume
Tampa, FL
SUMMARY:
- Over 9.5 years of IT experience in design, development, support, and testing of Data Warehousing technologies in the Banking, Financial, and Telecom domains
- Liaising with business users, the project management team, and other stakeholders on requirement gathering, design, coding, and UAT testing, and deploying changes per business needs
- Expert in writing and optimizing complex SQL queries in Oracle, SQL Server, and Teradata.
- Experience in Developing complex Mappings, Reusable Transformations, Sessions and Workflows using Informatica ETL tool to extract data from various sources and load into target tables.
- Expert in performance tuning of Informatica mappings, including identifying source and target bottlenecks.
- Experience in Data Warehousing/ETL Architecture designing and Data Modeling (Star & Snowflake Schema and LDM & PDM) for large data warehousing projects.
- Experience in designing online transaction processing (OLTP), operational data store (ODS), and decision support system (DSS) (e.g., data warehouse) databases, utilizing Data Vault (hub-and-spoke), dimensional, and normalized data designs as appropriate for enterprise-wide solutions
- Experience with Inmon and Kimball data warehouse design and implementation methodologies
- Experience designing Star schema and Snowflake dimensional modeling using Erwin tool.
- Worked on integration and implementation of projects and products, database creations, modeling, calculation of object sizes, table spaces and database sizes
- Experienced in ETL technology conversions and Mainframe tools and technologies
- Expertise in Teradata utilities such as MultiLoad, FastLoad, FastExport, BTEQ, TPump, and TPT, and in tools such as SQL Assistant and Viewpoint
- Experience generating reports from PDCR and DBC tables for senior bank management, covering requirements such as CPU/IO/space usage by application, ID, and load process; unused objects; access reports; and application interdependencies
- Well-versed in tuning Teradata ETL queries, remediating excessive statistics, resolving spool-space issues, and applying compression for space reclamation
- Good understanding of query execution, explain plans, join strategies
- Proficient in Data Analysis, Data Validation, Data Lineage, Data Cleansing, Data Verification, and identifying data mismatches
- Extensive experience in Data Analysis and ETL Techniques for loading high volumes of data and smooth structural flow of the data.
- Extensive experience with business intelligence (BI) tools and technologies such as OLAP, data warehousing, reporting and querying tools, data mining, and spreadsheets.
- Adept at writing Data Mapping Documents, Data Transformation Rules and maintaining Data Dictionary and Interface requirements documents.
- Exposure to end-to-end SDLC, RUP, and Agile methodologies
- Handling day-to-day/weekly/monthly report requirements like status update, Defect root cause analysis, Project metrics etc. and ensuring timely delivery of the same
- Skilled in creating & maintaining Functional Specification Documents (FSD), Use Cases, Data field mapping, Process Flow mapping, Requirement Traceability Matrix, performing Impact Assessment and providing Effort estimates, deployment artifacts
- Assisted business in preparing BRD (Business Requirement Document) & CR (Change Request) Document
- Knowledgeable in HP QC, Visio & creating process models/ process diagrams / flow charts
- Have expertise in Data Conversion and Data Migration Projects
- Domain knowledge on Loans, Insurance and Credit Card, Wireline and Wireless telecom data services
- Good communication and presentation skills; works well as a leader, as an integral part of a team, and independently; intellectually flexible and adaptive to change
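The Teradata load utilities listed above (FastLoad, MultiLoad, BTEQ) are typically driven by generated control scripts. As a minimal, hypothetical sketch, a shell wrapper can emit a FastLoad script for a pipe-delimited flat file; the TDPID, credentials, table, columns, and paths below are illustrative placeholders, not taken from any actual engagement:

```shell
#!/bin/sh
# Hypothetical sketch: generate a Teradata FastLoad control script from job
# parameters. All names (TDPID, user, table, columns, paths) are placeholders.

TABLE="${1:-stg_db.customer_stg}"
DATAFILE="${2:-/data/in/customer.txt}"
CTLFILE="${3:-load_customer.fld}"

cat > "$CTLFILE" <<EOF
.LOGON tdprod/etl_user,etl_password;
DROP TABLE ${TABLE}_err1;
DROP TABLE ${TABLE}_err2;
SET RECORD VARTEXT "|";
DEFINE cust_id   (VARCHAR(18)),
       cust_name (VARCHAR(60)),
       open_dt   (VARCHAR(10))
FILE = ${DATAFILE};
BEGIN LOADING ${TABLE}
    ERRORFILES ${TABLE}_err1, ${TABLE}_err2;
INSERT INTO ${TABLE} (cust_id, cust_name, open_dt)
VALUES (:cust_id, :cust_name, :open_dt);
END LOADING;
.LOGOFF;
EOF

echo "wrote $CTLFILE"
```

Parameterizing the script this way lets one wrapper serve many staging tables; the generated file is then passed to the `fastload` client by the scheduler.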
TECHNICAL SKILLS:
Domain and Technology: Banking and Financial Services, Data Warehousing
Operating Systems: Windows, Mainframe z/OS, UNIX
Databases: Teradata V2R5/V2R6/V2R12/13.0/14.10, SQL Server 2005/2008, Oracle 11g
Database Design Tools: ERwin 9.5.2/7.3/4.1, MS Visio 2007
Scripting: C, C++, PL/SQL, Perl, Shell Scripting
Tools and Software: Informatica 9.6/9.1/8.6/8.1, Viewpoint, SharePoint, Microsoft Technologies, Autosys 11.1/4.5, CA7, SAS, Easytrieve, REXX, ChangeMan, Endevor, JCL, COBOL, File-AID, File Manager, SAR, XML, Tableau 9.2
DB Utilities: Teradata SQL Assistant, FastLoad, MultiLoad, BTEQ, FastExport, TPump, TPT, Viewpoint
PROFESSIONAL EXPERIENCE:
Confidential, Tampa, FL
Informatica Lead Developer/ Data Analyst
Responsibilities:
- Involved in enhancement efforts implementing business rules for the Marketing Profile, Network Evolution, and Wire Center call-routing applications
- Designed and developed ETL processes from different source systems to transform data per business requirements for use by the reporting teams
- Provided technical solutions to design and build processes, implement systems, and hand the processes over to the production support team
- Worked on design, data extraction, and data manipulation, and generated monthly reports for executive management
- Performed extensive research on revenue-count discrepancies through comparative and statistical analysis of the monthly reports
- Masked data in tables/views for users for testing and reporting purposes
- Experienced in creating Informatica mappings, mapplets, sessions, worklets, workflows, transformations, and processing tasks using Informatica Designer/Workflow Manager to move data from multiple source systems into targets
- Prepared the Informatica mapping document and was also involved in preparing the Informatica checklist
- Performed research, requirements gathering, database development, and analytical report creation to meet business needs
- Prepared files and reports by performing ETL, data extraction, and data validation, managed metadata and prepared Data dictionary as per the project requirement
- Delivered ad hoc and data analytics projects to meet business application needs, including data validation reports for business teams as per requirements
- Worked in tuning the ETL processes to reduce the processing time
- Monitored the ETL process daily and performed reporting and data correction before the daily report was sent to stakeholders and business partners
- Generated required data for direct marketing teams on a daily basis to support their promotional plans
- Performed code reviews of the scripts delivered by the off-shore and on-site teams to ensure the correctness of functionality and performance.
- Worked with architects and the functional team/users to define requirements/specs for enhancements and new deliverables.
- Delivered shell scripts as per the requirements and made modifications to the existing scripts based on the functionality.
- Monitored batch jobs and validated the results to make sure that the functionality is implemented correctly.
- Created mapping documents to aid with the development of more complex PL/SQL reports. Illustrated the data flow in the architecture using Flowcharts (MS Visio).
- Worked on enhancement effort involving various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner and Union to develop robust mappings in the Informatica Designer.
- Worked with onsite-offshore teams to track work, resolve issues, and review and migrate code.
- Worked on L2 support, resolved numerous issues based on priority, and prepared knowledge-base documentation.
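Batch monitoring of the kind described above is often wrapped in small shell checks. A hedged sketch, assuming Informatica's `pmcmd` CLI: the actual call is commented out and replaced by a canned status line so the parsing logic stands alone, and the service, folder, and workflow names are hypothetical:

```shell
#!/bin/sh
# Hedged sketch of a batch-monitoring wrapper around Informatica's pmcmd.
# The pmcmd invocation is commented out (it requires a live domain); a sample
# output line stands in for it. All connection names are placeholders.

# out=$(pmcmd getworkflowdetails -sv INT_SVC -d DOMAIN -uv PM_USER -pv PM_PWD \
#       -f FOLDER wf_daily_load)
out="Workflow run status: [Succeeded]"   # sample pmcmd output line

# Extract the bracketed status token from the output.
status=$(echo "$out" | sed -n 's/.*Workflow run status: \[\(.*\)\].*/\1/p')

if [ "$status" = "Succeeded" ]; then
    echo "wf_daily_load OK"
else
    echo "wf_daily_load FAILED with status: $status"
    exit 1
fi
```

A wrapper like this gives the scheduler a clean exit code to branch on, instead of leaving failures to be noticed in the Workflow Monitor.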
Environment: Informatica PowerCenter 9.6, Informatica Data Quality 9.6, TOAD 11.6, Oracle 11g, LINUX, WinSCP, Flat Files, Unix Shell Scripting, Autosys 11/4.5, Teradata 14.
Confidential, Jacksonville
Informatica Lead
Responsibilities:
- Work mainly involved loading data from different domains, including Channels, Deposits, eCommerce, Marketing, and Loans, per Service Level Agreements (SLAs), and ensuring data was available on time
- Coordinated with business users, stakeholders, and SMEs for functional expertise, design and business test-scenario reviews, UAT participation, and validation of financial data
- Developed complex mappings in Informatica to load the data from various sources using different transformations like Source Qualifier, Look up, Expression, Aggregate, Update Strategy, Filter and Router transformations.
- Worked on numerous activities like monitoring Teradata platform using Viewpoint, remediating excessive statistics, tuning production ETL queries, resolving spool space issues, tuning high impact analytical queries, applying compression for space reclamation, Data Mover Development and Support etc.
- Based on business needs, prepared conceptual data models for long-term solutions and created logical and physical data models using best practices to ensure high data quality.
- Prepared data dictionaries/business glossaries and integrated them into the data models
- Optimized and updated logical and physical data models to support new and existing design requirements.
- Responsible for metadata management, keeping centralized metadata repositories up to date using Erwin modeling tools
- Used the Data Vault technique, which simplified the data ingestion process, removed the cleansing requirements of a star schema, and allowed new data sources to be added without disrupting the existing schema
- Used Data Vault both as a data-loading technique and as a methodology that accommodates historical data, auditing, and data tracking.
- Implemented functional and practical data governance and designed common data governance frameworks
- Worked extensively with Erwin Data Modeler to design data models.
- Designed ODS, and Data Vault with expertise in Loan and all types of Cards.
- Worked in enhancement of the existing Teradata processes running on the Enterprise Data Warehouse
- Analyzed and generated critical regulatory reports sent to the government
- Performed many application reloads due to source-side issues or delays in upstream applications caused by migrations/maintenance activities
- Experienced in writing sort cards and Teradata queries to resolve day-to-day and recurring issues and reduce manual effort
- Improved the performance of existing data warehouse applications to increase the efficiency of the existing system.
- Worked on multiple issues raised by data warehouse users/consumers and helped them analyze and modify their queries to pull reports
- Worked heavily on SQL query optimization, tuning queries and reviewing their performance metrics
- Resolved business and technical queries from business partners related to decision making and reporting on data available in the data warehouse.
- Played a key role in planning and executing the hardware upgrade load-recovery activities
- Incorporated email notifications into all reports to ensure no time is lost when the automated process fails due to dependency issues
- Extracted data using File Manager and tracked data issues using the Maximo tool
- Worked on a DASD space-reclamation effort on the 1D LPAR, completed across two phases with overall space savings of 207 TB, amounting to 1.34M USD.
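Validating loads before reports go out, as described above, can be sketched as a record-count reconciliation between a data file and its control/trailer file. Everything below (file layouts, names, the alert address) is an illustrative assumption, with sample data created inline so the check is self-contained:

```shell
#!/bin/sh
# Hedged sketch of a daily load reconciliation: compare the record count in a
# delimited data file against the count declared in its control file, and flag
# a mismatch before the downstream report is released. A real job would
# receive these files from the load process.

printf 'a|100\nb|250\nc|75\n' > deposits_daily.dat       # sample data file
printf '20240101|3\n' > deposits_daily.ctl               # LOADDATE|ROWCOUNT

actual=$(wc -l < deposits_daily.dat | tr -d ' ')
expected=$(cut -d'|' -f2 deposits_daily.ctl)

if [ "$actual" -eq "$expected" ]; then
    echo "RECON OK: $actual rows loaded"
else
    echo "RECON FAIL: expected $expected rows, got $actual"
    # mailx -s "Daily load recon failure" dw-support@example.com </dev/null
    exit 1
fi
# prints "RECON OK: 3 rows loaded"
```

The non-zero exit on mismatch lets the scheduler hold the reporting step, and the commented `mailx` line shows where an email alert (per the notification bullet above) would hook in.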
Environment: Informatica PowerCenter 9.1, Oracle 11g, Teradata 13, IDQ 9.1, SQL* Plus, Toad, Microsoft SQL Server 2008, DB2 V9, Windows 7, UNIX, Sun Solaris, Putty, AIX.
Confidential
Sr. Informatica Developer
Responsibilities:
- Involved in analysis of business requirements, design and development of high-level and low-level designs, and unit and integration testing
- Gathered requirements, developed, tested, and provided support during various conversions, such as the NW (North-West) and CA (California) conversions into the Model state
- Interacted with clients and users to understand their requirements and provided solutions to meet them.
- Thoroughly analyzed a complex system and created its system document without any existing input/documentation, which helped win the project from competitors.
- Participated in client discussions to gather scope information and perform analysis of scope information to provide inputs for project scoping documents
- Fine-tuned the existing Informatica mappings for better performances.
- Used UNIX scripting for file formatting before it gets into Informatica.
- Involved in performing impact analysis before any enhancements are made to the existing Informatica code or existing Tables.
- Based on business needs, prepared conceptual data models for long-term solutions and created logical and physical data models using best practices to ensure high data quality.
- Prepared data dictionaries/business glossaries and integrated them into the data models
- Optimized and updated logical and physical data models to support new and existing design requirements.
- Responsible for metadata management, keeping centralized metadata repositories up to date using Erwin modeling tools
- Used the Data Vault technique, which simplified the data ingestion process, removed the cleansing requirements of a star schema, and allowed new data sources to be added without disrupting the existing schema
- Used Data Vault both as a data-loading technique and as a methodology that accommodates historical data, auditing, and data tracking.
- Implemented functional and practical data governance and designed common data governance frameworks
- Worked extensively with Erwin Data Modeler to design data models.
- Provided inputs for the overall implementation plan, led deployment of applications/infrastructure, and supported post-production activities.
- Good domain knowledge in Cards
- Developed complex modules and delivered defect-free, highly optimized deliverables
- Trained and mentored new team members on domain and technical knowledge and all processes
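The UNIX file-formatting step mentioned above (preparing a flat file before Informatica picks it up) can be sketched with standard tools; the feed layout and file names here are made up for illustration:

```shell
#!/bin/sh
# Illustrative pre-Informatica file prep: strip Windows carriage returns,
# drop the header row, and convert the delimiter from comma to pipe so the
# file matches the mapping's source definition. Sample input is created
# inline; a real job would receive the feed from upstream.

printf 'id,name,amount\r\n1,alice,100\r\n2,bob,250\r\n' > raw_feed.csv

tr -d '\r' < raw_feed.csv | tail -n +2 | sed 's/,/|/g' > clean_feed.dat

cat clean_feed.dat
# 1|alice|100
# 2|bob|250
```

Doing this outside the mapping keeps the Informatica source definition simple and makes malformed feeds fail fast at the file-prep step.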
Environment: Informatica PowerCenter 9.0.1, Oracle 11g, Teradata 12.0, IDQ 9.0, LINUX, MS SQL Server, ERWIN, Autosys 4.5, WinSCP, AIX, XML, Unix Shell Scripting and Windows 7
Confidential
Informatica Developer/ Database Developer/ Data Modeler/ Data Analyst
Responsibilities:
- Understanding the Functional Design Specs and preparing the Technical design
- Involved in designing and developing High level and Low level designs
- Worked Extensively on Informatica tools -Repository Manager, Designer and Server Manager.
- Involved in Extraction, Transformation and Loading (ETL) Process.
- Created the Source and Target Definitions using Informatica Power Center Designer.
- Developed and tested all the backend programs, Informatica mappings and update processes.
- Created and Monitored Batches and Sessions using Informatica Power Center Server.
- Tuned the mappings to increase its efficiency and performance.
- Used Informatica Workflow Manager to create workflows
- Workflow Monitor was used to monitor and run workflows
- Developed unit test plans thoroughly covering all business scenarios
- As a new hire, performed technical analysis of the existing DALS and IALS COBOL code and recreated the equivalent functionality using Teradata BTEQ utilities.
- Worked on coding and testing during bank acquisitions/mergers, such as Merrill Lynch and Countrywide
- Worked on enhancement activities related to GL accounts.
- Involved in unit testing, systems testing, integrated testing and user acceptance testing.
- Prepared documents related to various HELOC, HELOAN, Reverse Mortgage, SEMAX applications.
- Supported the application on the Warranty Period.
Environment: Informatica PowerCenter 8.6.1/8.1, Oracle BI Apps 7.9/6.1, Oracle 11g/10g, Teradata 12/V2R5, BOXI R2, Rally, Agile, Putty, Jira, HP ALM, Toad, Linux.