Sr. Informatica Developer/Snowflake Developer Resume
- Results-driven IT professional with 6+ years of extensive experience across multiple domain areas, with proficiency in the SDLC and proven in all aspects of DW/BI and cloud technologies.
- Performance-driven leader with an interest in everything related to data analytics, data modeling, data quality, data profiling, and data visualization, and keen to learn and adopt new technologies and methodologies.
- Worked on processes to transfer/migrate data from AWS S3, relational databases, and flat files through common staging tables in various formats into meaningful data in Snowflake.
- Excellent hands-on experience working with the Snowflake virtual data warehouse.
- Expertise in moving on-premises data to the cloud (AWS S3, MS Azure, Google Cloud Platform).
- Expertise in areas of Data Warehouse and Data Lake concepts, Cloud Data Integration, Data Quality, Data Vault Modeling, Data & ETL Architecture, Data Governance and Strategy.
- Proficient in query optimization and effective database management across different RDBMS databases (Oracle, Teradata, DB2).
- Experience setting up multi-cluster virtual warehouses, streams, tasks, stored procedures, and Time Travel/Fail-safe/Cloning.
- Good understanding of data domains and subject areas (i.e., Equipment, Customer, Telematics data).
- Strong experience in installation/upgrade of Informatica and configuration of Informatica PowerCenter 10.x/9.6.1.
- Hands-on experience in end-to-end implementation of DW/BI projects, especially data warehouse and mart development.
- Hands-on experience with data integration, data quality, and data architecture.
- Experience in preparing test scripts and test cases to validate data and maintain data quality.
- Experience with design and implementation of ETL/ELT frameworks for complex warehouses/marts.
- Knowledge of large data sets and experience with performance tuning and troubleshooting.
- Working knowledge of Informatica PowerExchange to support data integration requirements.
- Extensively worked on transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator and Stored Procedure transformations.
- Extensive experience in developing Informatica mappings/mapplets using various transformations for extraction, transformation, and loading of data from multiple sources to the data warehouse, and in creating workflows with worklets & tasks and scheduling the workflows.
- Extensively involved in resolving Informatica and database performance issues.
- Expertise in error handling, debugging and problem fixing in Informatica.
- Experience working with SQL Server Integration Services (SSIS).
- Experience in using Oracle 10g/9i/8i, Teradata 13, MS SQL Server 2005, SQL, PL/SQL.
- Automated Deployments and Release Management wif Continuous Integration and Continuous Delivery.
- Progressively elaborate and document user requirements into User Stories, Document of Understanding, Technical and Design documents.
- Strong analytical skills and problem-solving skills to understand the business requirements.
- Experience in using loaders like SQL*Loader, FastLoad, MultiLoad, and TPump to load data from external files into databases.
- Extensive experience in writing PL/SQL program units such as triggers, procedures, functions, and packages in UNIX and Windows environments.
- Exposure to onshore and offshore support activities.
- Excellent verbal and written communication skills; has a clear understanding of business procedures and the ability to work individually or as part of a team.
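The Snowflake capabilities listed above (multi-cluster warehouses, streams, tasks, Time Travel, cloning) can be illustrated with a short sketch; all warehouse, table, and task names here are hypothetical, not from any actual project:

```sql
-- Hypothetical names throughout; a minimal sketch of the listed features.
-- Multi-cluster virtual warehouse that scales out under concurrency.
CREATE WAREHOUSE IF NOT EXISTS etl_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 3
  AUTO_SUSPEND = 300;

-- Zero-copy clone for a point-in-time development copy.
CREATE TABLE sales_dev CLONE sales;

-- Time Travel: query the table as it stood one hour ago.
SELECT * FROM sales AT (OFFSET => -60 * 60);

-- Stream + task: pick up and apply new changes on a schedule.
CREATE STREAM sales_stream ON TABLE sales_staging;
CREATE TASK apply_sales
  WAREHOUSE = etl_wh
  SCHEDULE  = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('SALES_STREAM')
AS
  INSERT INTO sales SELECT * FROM sales_stream;
ALTER TASK apply_sales RESUME;
```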
Operating System: UNIX, Linux, Windows Server 2008 R2
Development: PL/SQL, UNIX shell script
Tools: TOAD, Oracle SQL Developer, Visio, Tidal, Autosys
Databases: Oracle 11g/10g/9i/8i, Teradata V2R15, DB2, SQL Server and Snowflake
Reporting Tools: Business Objects, Tableau
ETL Tools: Informatica 10.x/9.x/8.x
Methodologies: Kimball, star and snowflake schemas, ER modeling
Confidential, Peoria, IL
Sr. Informatica Developer/Snowflake Developer
- Responsible for requirement gathering, client meetings, and discussing the issues to be resolved, and translated user inputs into ETL design documents.
- Responsible for documenting the requirements, translating the requirements into system solutions, and developing the implementation plan as per the schedule.
- Created the ETL mapping document and ETL design templates for the development team to ensure consistent data analysis, data quality, operational management, monitoring, and technical troubleshooting.
- Created external tables on top of S3 data that can be queried using AWS services like Athena.
- Responsible for architecting and implementing very large-scale data intelligence solutions around Snowflake Data Warehouse.
- Solid experience and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on Snowflake Cloud Data Warehouse.
- Heavily engaged in monitoring pipelines and triaging data quality and performance issues.
- Involved in migrating on-premises systems to the AWS cloud.
- Processed Location and Segments data from S3 to Snowflake using Tasks, Streams, Pipes, and stored procedures.
- Led a migration project from Teradata to the Snowflake warehouse to meet customer SLAs.
- Built distributed, reliable, and scalable data pipelines to ingest and process data in real time.
- Created ETL pipelines using Stream Analytics and Data Factory to ingest data from Event Hubs and Topics into SQL Data Warehouse.
- Liaised with data delivery teams to establish metrics, evaluate production readiness, and resolve complex data and pipeline issues.
- Contributed to policies, procedures, and standards as well as technical requirements.
- Maintained communication across internal teams and technology partners to constantly monitor, collect, review, and roadmap data operations and related technology solutions.
- Ensured compliance with IT requirements and standards, including IT controls, audits, disaster recovery, etc.
- Maintained knowledge of the functional technology of the data platform and related systems.
- Maintained existing data quality scorecard measurements, executed scorecards, and resolved exceptions with data owners and source systems.
- Created, updated, and enhanced operational policies, processes, and tools.
- Hands-on experience in fine-tuning AWS data pipelines by resolving bottlenecks at the mapping, session, database, Unix/Linux server, and storage levels.
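The S3-to-Snowflake ingestion pattern described above (stage, pipe, stream) can be sketched as follows; the bucket, stage, and table names are hypothetical placeholders:

```sql
-- Hypothetical sketch of an S3-to-Snowflake continuous load.
-- External stage pointing at the S3 landing area.
CREATE STAGE locations_stage
  URL = 's3://example-bucket/locations/'
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- Snowpipe auto-ingests files as they land in the stage.
CREATE PIPE locations_pipe AUTO_INGEST = TRUE AS
  COPY INTO locations_raw FROM @locations_stage;

-- Downstream, a stream on the raw table feeds a stored
-- procedure or task that merges changes into the target.
CREATE STREAM locations_raw_stream ON TABLE locations_raw;
```

In practice the stage would also carry credentials or a storage integration, omitted here.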
Environment: Informatica 9.5/9.6, PowerExchange 9.5/9.6.1, Oracle, Teradata, UNIX, Tidal & Snowflake.
Confidential, Richmond, VA
- Analyzed the current state of EDW data and designed the integration of Salesforce accounts, membership, contract, customer, events, and opportunity data into the Confidential EDW.
- Developed the Leads data mart, integrating Leads, Events, and Opportunities from Salesforce with other engaged activities like Sales, Appraisal, and Credit from various marts in the EDW.
- Developed the logic using Customer Interactions to view the customer journey using Customer mart data from Reltio MDM.
- Collaborated with external auction vendors like Manheim and the internal Confidential data analytics team to understand auction data requirements, and developed a process to integrate Sales, MMR pricing, and Online Vehicle Exchange (OVE) data using REST API calls.
- Partnered with Repairpal.com and integrated vehicle repair order and claims data into the Confidential EDW environment using REST API calls, uploading JSON-formatted data into the RepairPal ESP claims API.
- Analyzed and designed the data integration of customer reviews from the Qualtrics.com web survey, formatting survey questions and answers into horizontal and vertical table structures.
- Collaborated with the marketing team to understand requirements and integrated Confidential customer email notification data for the number of online clicks.
- Wrote/analyzed functional requirements based on obtained business requirements and designed solutions aligned with the existing EDW target state.
- Implement data integration best practices and conventions, code review, design testing strategy and elaborate test data sets.
- Converted existing data mart stored procedures from Teradata to Snowflake.
- Created BTEQ scripts for Teradata and used the FastLoad, MultiLoad, and TPT update utilities.
- Optimized queries for performance and tuned tables in Snowflake.
- Used Snowflake Tasks to schedule procedures and Streams to load data in real time.
- Designing and implementing validation stored procedures for data loads
- Extracted data from sources such as flat files, XML, JSON, SQL Server, Sybase, and Teradata databases.
- Developed IICS mappings and mapping configuration tasks to integrate Salesforce data into the EDW system.
- Created Informatica mappings, sessions, workflows, and reusable components to process data from XML files and relational data.
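The validation-procedure and Task-scheduling pattern mentioned above can be sketched in Snowflake SQL Scripting; the procedure, task, and table names are hypothetical:

```sql
-- Hypothetical sketch of a load-validation stored procedure.
CREATE OR REPLACE PROCEDURE validate_load(table_name STRING)
RETURNS STRING
LANGUAGE SQL
AS
$$
DECLARE
  row_count INTEGER;
BEGIN
  -- Basic completeness check: the load must have produced rows.
  SELECT COUNT(*) INTO :row_count FROM IDENTIFIER(:table_name);
  IF (row_count = 0) THEN
    RETURN 'FAIL: ' || table_name || ' is empty';
  END IF;
  RETURN 'PASS: ' || row_count || ' rows';
END;
$$;

-- A Task runs the validation on a schedule, as described above.
CREATE TASK validate_members
  WAREHOUSE = etl_wh
  SCHEDULE  = 'USING CRON 0 6 * * * UTC'
AS
  CALL validate_load('MEMBERS');
```

A real validation procedure would typically add duplicate-key and referential checks alongside the row count.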
Environment: Informatica (IICS), AWS S3, Informatica PowerCenter 9.6.1, REST API, Web services, Teradata 15.0, SQL Server, ServiceNow, Git and GitHub.
Confidential, Rockville, MD
- Analyzed source and target data warehouses and data mart data model for migration
- Responsible for requirement gathering, user meetings, and discussing the issues to be resolved, and translated user inputs into ETL design documents.
- Responsible for documenting the requirements, translating the requirements into system solutions and developing the implementation plan as per the schedule.
- Providing the impact analysis for the new requirements, created Source to Target Mapping Specification Document.
- Involved in the complete SDLC with data modeling, from logical and physical to conceptual data models. Extracted data from various heterogeneous sources like DB2, Oracle, SQL Server, flat files, and COBOL files.
- Worked wif Power Center tools like Designer, Workflow Manager, Task Developer, Workflow Monitor and Repository Manager.
- Extensively used various active and passive transformations.
- Defined the process for other project team members to follow.
- Completed documentation of database structures and the processes used to move data in and out of the database, for future development and explanation of the database.
- Responsible for Unit Testing of mappings and workflows.
- Worked on Slowly Changing Dimension mappings of Type I, II, and III.
- Involved in performance tuning for source, target, mappings, sessions and server.
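A Type II slowly changing dimension, as referenced above, keeps history by expiring the current row and inserting a new version. A minimal sketch, with hypothetical table and column names:

```sql
-- Hypothetical SCD Type II load: expire changed current rows...
UPDATE dim_customer d
SET    d.current_flag = 'N',
       d.end_date     = CURRENT_DATE
WHERE  d.current_flag = 'Y'
AND EXISTS (SELECT 1 FROM stg_customer s
            WHERE s.customer_id = d.customer_id
            AND   s.address    <> d.address);

-- ...then insert a fresh current version for changed and new keys.
INSERT INTO dim_customer
  (customer_id, address, start_date, end_date, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE d.customer_id  = s.customer_id
                   AND   d.current_flag = 'Y');
```

Type I would simply overwrite the attribute in place; Type III would hold the prior value in a separate "previous" column.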
Environment: Informatica Power Center 9.1/8.6, Oracle, Windows NT and UNIX.
Confidential, Irving, TX
- Analyzed, designed, and documented the ETL solution and mapping documents for processing the data integration into the Encounters, Party, and Patient data marts.
- Designed and implemented end-to-end interfaces for Wellcentive cloud analytics for vendors like eCW, Athena, and BCBS.
- Developed Informatica mappings, workflows, and reusable components such as mapplets and worklets.
- Developed and implemented data quality rules for data standardization and enrichment.
- Worked with unstructured data like HL7 (ADT, BAR), EDI 835/837, and CMS-CCLF.
- Worked on the Population Health, Clinical Strategies, and Patient Satisfaction subject areas.
- Performance tuning of mappings and SQL queries used in stored procedures with explain plans and other pre- and post-SQL and SQL overrides.
- Implemented an ABCE framework with exception-handling strategies to capture errors during loading processes, notify the source team of exception records, and automate loading of failed records and missing files to the warehouse using Python.
- Performed Informatica administration activities. Upgraded Informatica from 9.5 to 10.1 and implemented security groups for LDAP in all environments (DEV, TEST, PROD).
- Worked with large tables of 60 million rows and a data warehouse of 20 TB.
- Extensively worked on Tidal job creation and scheduling, adding calendars according to client requirements.
- Extracted data from sources like SQL Server, DB2, Oracle, MQ (real time from EMR), XML, and JSON.
- Wrote/analyzed functional requirements based on obtained business requirements and designed ETL/DQ solutions aligned with the informational architecture and data governance.
- Built a data lake POC using a Cloudera Hadoop (6-node) ecosystem and Informatica Big Data Edition (BDM).
- Implement data integration best practices and conventions, code review, Design testing strategy and elaborate test data sets.
- Created BTEQ scripts for Teradata and used the FastLoad and MultiLoad utilities.
- Masked PHI data from PROD to lower environments using Informatica TDM services.
- Involved in MicroStrategy report and dashboard development and cube refresh.
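The BTEQ loading pattern mentioned above typically reads a delimited file and applies rows with a repeated parameterized insert. A minimal sketch, with hypothetical file, table, and logon names:

```sql
-- Hypothetical BTEQ import sketch; names are placeholders.
.LOGON tdprod/etl_user,password;

.IMPORT VARTEXT '|' FILE = /data/claims.txt;
.REPEAT *
USING (claim_id VARCHAR(10), amount VARCHAR(12))
INSERT INTO stg.claims (claim_id, amount)
VALUES (:claim_id, :amount);

.LOGOFF;
```

For bulk volumes, FastLoad (empty-table loads) or MultiLoad (inserts/updates on populated tables) would replace the row-at-a-time BTEQ insert.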
Environment: Informatica 10.1, PowerExchange 10.1, TDM, IDQ 10.0, Teradata V2R15, Linux, SQL Assistant, TD Studio, MicroStrategy, ERwin, McKesson, Athena, ServiceNow, Diplomat MFT
Confidential, Atlanta, GA
- Performed full SDLC life cycle in analysis, design, development, testing, UAT, implementation and post implementation support activities.
- Documented the Component Test and Assembly Test results in common Share Point.
- Used Informatica PowerCenter 7.1/6.0/5.1 to extract data from flat files, Oracle, and Sybase databases and load into Sybase, Oracle, and Teradata databases and flat files.
- Performance tuned mappings and sessions to achieve best possible performance.
- Created tasks, worklets and workflows and scheduled workflows to run the jobs at required frequency using Workflow Manager.
- Involved in Deployment Activities and Hypercare activities.
- Extracted data from multiple data sources, performed multiple complex transformations and loaded data into SQL Server Tables.
Environment: Informatica Power Center 7.1, Oracle 8.0/7.x, Windows NT 4.0