- 14+ years of total IT experience in the architecture, requirements/data analysis, design, development, testing, deployment, and production support of Data Warehousing and Business Intelligence projects in the Manufacturing, Public Safety, Healthcare, and Insurance domains.
- 12+ years of strong Data Warehousing and ETL experience using the Informatica suite of products and Microsoft SQL Server Integration Services (SSIS).
- 12+ years of hands-on experience with DBMSs such as Oracle, Exadata, Teradata, Netezza, DB2, MS SQL Server, and MS Access.
- 8+ years of Dimensional Data Modeling experience using Star Schema/Snowflake modeling, physical and logical data modeling, and forward/reverse engineering.
- 5+ years of Business Intelligence reporting experience using Microsoft SQL Server Reporting Services (SSRS), Business Objects, MicroStrategy, Power BI, Seagate Crystal Reports, and Excel Pivot Table services.
- 4+ years of Project Manager/Tech Lead experience in project planning, estimation, status reporting, and task delegation.
- Good working knowledge of MDM, Big Data, Hadoop, AWS, Salesforce, and other cloud technologies.
Data Warehousing: Informatica PowerCenter, PowerExchange, Developer, BDM, MDM, Microsoft SQL Server SSIS, Teradata, Netezza, Exadata, AWS Redshift
Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Physical and Logical Data Modeling, Embarcadero ER/Studio, Erwin, Oracle Designer
Programming: Unix Scripting, PERL, SQL, PL/SQL, ANSI SQL, Transact SQL, JCL, Python, Java, NZSQL, BTEQ
BI and Reporting: Business Objects, MicroStrategy, SSRS, Crystal Reports, Tableau, Power BI
Databases: Exadata, Oracle, MS SQL Server, DB2 UDB, MS Access, MySQL, Teradata, Redshift, Netezza
Other knowledge: AWS, Salesforce, Big Data, HDFS, MapReduce, Agile, SDLC, Waterfall, Spiral
Confidential, Washington, DC
Technical Lead, ETL Architect, Prod Support Analyst, Sr Developer
- Design, develop, and support ETL processes in the OLTP and OLAP layers, including Operational Data Stores, Canonical and Enterprise Dimensional Models, and other Business Intelligence databases.
- Provide estimates and perform planning activities for new applications across all phases of the Software Development Lifecycle.
- Act as an ETL Subject Matter Expert for Claims, Providers, Membership, Enrollment, and Revenue data mappings in EDM/ODS/CBIW to identify areas to apply MDM and Data Quality techniques.
- Develop Informatica Developer mappings to ingest Claims data into Hive tables and Parquet files on the Hadoop cluster (HDFS).
- Architect, design, and develop ETL processes using tools such as Informatica PowerCenter, Exadata, Oracle, Unix, SQL Server, Mainframe JCL, and Subversion.
- Work with Data Governance and Stewardship (DGS) team to create Source to Target Mapping and Defect Fix Specification documentation as part of Design, Development and Production Support activities.
- Work in the capacity of Technical Lead and provide work estimates, establish development timelines, manage deployment activities into Test and Production regions and schedule jobs based on dependencies.
- Work in the capacity of Technical Architect: provide guidance on reusability, restartability, and Change Data Capture; automate parameterization across environments; design an audit control process to track run-time statistics and maintain run history of jobs; and performance-tune sub-optimal jobs.
- Create and maintain Data Flow Diagrams, High Level/Low Level Design Documents, Production Support Documents for various projects.
- Design logical and physical data models for the Oracle/SQL Server databases, and develop the Informatica mappings, SSIS packages, and PL/SQL and T-SQL scripts needed to fix historical data in EDM/CBI sourced from OLTP systems such as NASCO, FACETS, FEP, CVS, and Davis Vision used by the Health Informatics team.
- Create dashboards that surface run times and load statistics for jobs running in Production.
- Work with Business users and generate Ad-hoc Reports for auditing and analytical purposes.
- Migrate databases and schemas, and re-write Informatica/SSIS mappings, to Amazon Redshift to make use of cloud technologies for Data Warehousing projects.
- Integrate components such as Mainframe VSAM datasets, flat file sources, FTP scripts, and Informatica workflows, ensuring data flows from legacy systems into the CBIW data layers.
- Coordinate Unit, Integration, System, User Acceptance and Performance testing at various stages of the project life cycle using HP ALM.
- Lead, Mentor and Assist new college hires and junior developers as and when needed and provide the needed technical guidance and technical expertise to help them accomplish their tasks.
- Work as a member of the Production Support team and provide 24/7 on-call support to keep critical processes responsible for delivering the month-end reports running.
- Perform query optimization on long-running Oracle/SQL Server queries using EXPLAIN PLAN, partitions, indexes, and hints to reduce run times.
- Set up Mainframe JCL jobs in System Test, UAT and Tech Test environments and coordinate with Data Center team in scheduling them as per job dependencies.
- Perform business requirements analysis: gather functional specifications from business users and create technical specification documents.
- Develop Informatica mappings, sessions, and workflows to read COBOL data files and load them into Oracle tables.
- Develop Informatica mappings, UNIX scripts to dynamically create Parameter Files for each individual run.
- Delegate work to other developers in the team ensuring effective and efficient development of Informatica Mappings/Scripts and other project related code.
- Conduct Peer reviews of ETL code to ensure the development of efficient/accurate mappings.
- Fix defects raised post production due to business requirement changes or data model changes.
- Performance-tune ETL mappings.
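The dynamic parameter-file generation described in the bullets above can be sketched as a small shell script. The workflow name, session name, paths, and parameter names below are illustrative placeholders, not the actual project values:

```shell
#!/bin/sh
# Build an Informatica-style parameter file for a single run.
# All names and paths below are hypothetical examples.
RUN_DATE=$(date +%Y%m%d)
PARAM_FILE="/tmp/wf_load_claims_${RUN_DATE}.parm"

# The \$ escapes keep the literal $$ prefix that Informatica
# mapping parameters use, while ${RUN_DATE} still expands.
cat > "$PARAM_FILE" <<EOF
[Global]
\$\$RUN_DATE=${RUN_DATE}
[WF:wf_load_claims.ST:s_m_load_claims]
\$\$SRC_FILE=/data/inbound/claims_${RUN_DATE}.dat
\$\$TGT_SCHEMA=EDW_STG
EOF

echo "Wrote ${PARAM_FILE}"
```

Generating the file immediately before each workflow run keeps per-run values (dates, file names) out of the repository and consistent across environments.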
Environment: Informatica PowerCenter, PowerExchange, Developer, BDM, MDM, IDQ, MS SQL Server SSIS/SSRS, MicroStrategy, Power BI, Exadata, Oracle, FACETS, SQL Developer, Toad, HP ALM, SVN, GIT, Erwin, Unix Scripting, Perl Scripting, Mainframe JCL, Citrix VPN, Reflections
Confidential, East Moline, IL
Sr ETL Systems Analyst, Onsite/Offshore Coordinator
- Responsible for gathering the business requirements and create technical specifications for design of the systems.
- Create high-level and low-level mapping design documents, breaking business specifications down into technical source-to-target mapping documents.
- Coordinate with the offshore team in India on the development and testing of ETL mappings.
- Designed and developed mappings and workflows using Informatica PowerCenter 8.1/7.1; executed workflows to load data from DB2 and flat files into DB2 tables.
- Responsible for the day-to-day activities of the offshore team; successfully led an offshore team of 3 members.
- Communicated with offshore team to transfer knowledge and guide them in different aspects of ScoreCard Projects starting from Requirement Gathering to Production Implementation.
- Involvement in On Call Production Support for different tasks running on the Production Server.
- Responsible for implementing John Deere's standards for development, and for migrating objects such as mappings, sessions, workflows, and shell scripts from development to production environments.
- Performed impact analysis on database and Informatica ETL objects driven by enhancements to existing projects, defect/bug fixes in existing code, and new features added to the existing Scorecard applications.
- Created mappings, workflows, and schedules to extract data from flat files as they become available, copying them into an archive folder at the same time.
- Involved in unit testing and integration testing of all the mappings, workflows, schedules developed by the offshore team. Also conducted regression testing when migrating a major change impacting the application to Production.
- Modified mappings as per the changes in the end users’ requirements which are tracked using Mercury Quality Center.
- Involved in the migration of the Informatica repository from 7.1.1 to 8.1.1; led the testing team in validating the repository upgrade.
- Performed data integrity checks and data cleansing.
- Actively involved in all phases of the product's evolution from a single-dimension product to one covering multiple dimensions of Deere & Company's requirements.
- Developed UNIX shell scripts that automate processing of fixed-width flat files: loading the data from each file into an Archive table and, on successful completion, backing the file up to a specified location.
- Developed scripts that process flat files one by one until all user-supplied files for the day have been pulled from the Mainframe server and processed, and scheduled them using Unix cron jobs.
- Worked extensively with parameter files creating sections for sessions, modifying parameters and variables in the sections as and when needed.
- Developed UNIX scripts/Informatica mappings for generating parameter files dynamically whenever a scheduled job runs.
- Interacted with the production team and resolved issues of identifying dependent processes and scheduling them accordingly.
- Participated in performance tuning of ETL maps at the mapping, session, source, and target levels.
- Documented all changes made at the various levels.
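A rough sketch of the flat-file pickup-and-archive pattern described in the bullets above. The directory layout and file names are illustrative only, and the real load step would invoke the Informatica workflow (e.g. via pmcmd) rather than the placeholder shown:

```shell
#!/bin/sh
# Process each inbound fixed-width flat file, then move it into a
# date-stamped archive directory. Paths below are illustrative only.
INBOUND=/tmp/demo_inbound
ARCHIVE=/tmp/demo_archive/$(date +%Y%m%d)
mkdir -p "$INBOUND" "$ARCHIVE"

# Sample fixed-width record, just so the loop has something to do.
printf '0001SCORECARD  000123\n' > "$INBOUND/scorecard_01.dat"

for f in "$INBOUND"/*.dat; do
    [ -e "$f" ] || continue              # nothing to process
    # Placeholder for the real load step (e.g. pmcmd startworkflow ...)
    if wc -l < "$f" > /dev/null; then
        mv "$f" "$ARCHIVE/"              # archive only after a successful load
    fi
done
```

Run from cron, a script like this picks up whatever files have arrived since the last run and leaves the inbound directory empty once everything is archived.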
Environment: Informatica PowerCenter 8.1/7.1.4/6.2, DB2 v18.104.22.1684, Business Objects 8.0, HP-UX, MS VISIO, WINSQL, MS Query, Windows XP, Korn Shell Scripting, Perl Scripting.
Confidential, Baton Rouge, LA
Graduate Research Assistant
- Develop, Test, and Implement efficient and maintainable program logic for existing and new applications in accordance with the technical specifications from Research Associates and Professors.
- Developed Informatica mappings/sessions/workflows to gather data output by sensor applications and load it into a metrics data warehouse.
- Extensive use of transformations such as Aggregator, Joiner, Normalizer, Rank, and Sorter to create output files in a special format aiding research activity.
- Analyze the granularity of data to derive the type of functional logic to implement in order to generate aggregated reports for end user consumption.
- Generate clear and concise documentation as required within the programs and application documentation repository.
- Start/stop jobs on demand as required.
- Analyze and translate functional specifications into technical specifications including an estimated level of effort for the moderate sized project requests.
- Monitor performance of the assigned research system, resolve operational failures, and investigate reported code defects.
- Guide junior students on their assignments. Mentor student workers on business knowledge, system peculiarities, and complex technical issues.
ETL Systems Analyst Intern
- Performed data analysis of the source data coming from point of sales systems (POS) and legacy systems.
- Developed approach paper for the project after gathering the requirements from business users.
- Developed mappings using Informatica 6.1 designer to extract data from the source databases and flat files into oracle staging area.
- Developed transformation logic to cleanse the source data of inconsistencies during the source to stage loading.
- Implemented business rules and logic using Expression, Lookup, Sequence Generator, Aggregator, Joiner, Router, and Update Strategy transformations.
- Created session tasks, worklets and workflows to execute the mappings.
- Involved in the migration of the project from development to production.
- Involved in monitoring the workflows and in optimizing the load times.
- Extensively used the Informatica debugger to validate mappings and gain troubleshooting information about the data.