Ab Initio Developer Resume
SUMMARY:
- 5+ years of solid experience in the analysis, design, and development of data warehousing solutions, and in developing Extraction, Transformation and Loading (ETL) strategies using Ab Initio, Informatica Cloud, UNIX, DB2, HDFS, and mainframe datasets.
- Strong knowledge of Data Warehousing concepts and Dimensional modeling like Star Schema and Snowflake Schema.
- Hands-on experience with various Ab Initio components such as Join, Rollup, Scan, Reformat, Partition by Key, Partition by Round-robin, Gather, Merge, Dedup Sorted, and FTP.
- Well versed in Ab Initio parallelism techniques; implemented Ab Initio graphs using data parallelism and Multifile System (MFS) techniques.
- Experience integrating various data sources with multiple relational databases such as DB2 and Oracle, and integrating data from flat files, HDFS, and mainframe datasets.
- Exposure to Agile methodologies such as Scrum and Kanban.
- Proficient in job schedulers such as Ab Initio Control Centre, Ab Initio Operational Console, and Control-M.
- Designed and deployed well-tuned Ab Initio graphs (generic and custom) for UNIX environments.
- Able to communicate effectively with both technical and non-technical project stakeholders.
- Sun Certified in Core Java (2013).
- Able to work independently as well as in a team environment.
- Tracked all activities and tasks using Jira.
- Hands-on experience with ITSM and release activities.
TECHNICAL SKILLS:
ETL tools: Ab Initio, Informatica Cloud
Languages: UNIX Shell scripting, Core Java, COBOL
RDBMS: DB2, MySQL, SQL Server 2008/2008 R2/2012/2015
Big Data Ecosystem: Hadoop, Kafka, Hive, YARN, HBase, NoSQL
Job Schedulers: Ab Initio Control Centre, Ab Initio Operational Console, Control-M
Operating Systems: OS/390 (Mainframe), Windows 7, Linux (Ubuntu 10.04)
Special Software: Mainframe ISPF, DbVisualizer, FileZilla, PuTTY, File-AID, ZEKE
Areas of Interest: Data Structures, Operating Systems
Networking: HTTP, SFTP
Version Control: Ab Initio Enterprise Meta Environment (EME), TortoiseSVN
Agile Methodologies: Scrum, Kanban
PROFESSIONAL EXPERIENCE:
Confidential
Ab Initio Developer
Languages & Technologies Used: Ab Initio (GDE, EME, TRMC, Ab Initio Control Centre, Ab Initio Operational Console), UNIX shell scripting, COBOL, JCL, DbVisualizer, FileZilla, PuTTY, File-AID, ZEKE, and Control-M
Responsibilities:
- Design and development of Data Warehouse/ETL processes using Ab Initio.
- Extensive use of the Multifile System, partitioning data 4, 8, 16, or 64 ways for parallel processing.
- Used Lookup Files when pulling data from multiple sources where the data volume is limited.
- Developed generic graphs for data cleansing, data validation, and data transformation.
- Implemented parameter definitions at multiple levels, such as project and graph parameters, in place of start and end scripts.
- Responsible for cleansing data from source systems using Ab Initio components such as Join, Dedup Sorted, Denormalize, Normalize, Reformat, Filter by Expression, and Rollup.
- Worked with departition components such as Concatenate, Gather, Interleave, and Merge to departition and repartition data from multifiles.
- Worked with partition components such as Partition by Key, Partition by Expression, and Partition by Round-robin to partition data from serial files.
- Used phases and checkpoints in graphs to avoid deadlocks, improve performance, and recover graphs from the last successful checkpoint.
- Developed graphs to extract internal/external data from different source databases using multiple input file components and by configuring the .dbc file in the Input Table component.
- Involved in System and Integration testing of the project.
- Wrote several shell scripts to remove old files and move raw logs to the archive.
- Tuned Ab Initio graphs for better performance.
- Created sandboxes and edited sandbox parameters according to the repository.
- Developed parameterized graphs using formal parameters.
- Processed and transformed daily delta feeds of customer data.
- Developed dynamic graphs to load data from data sources into tables and to parse records.
- Drafted runtime documents to give insight into the projects from a production support perspective.
- Worked with SFTP to transfer files between servers.
- Tracked all activities and tasks using Jira.
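The log-housekeeping scripts mentioned above could be sketched roughly as follows; the directory arguments and the 7-day/30-day retention windows are illustrative assumptions, not details from any actual project:

```shell
#!/bin/sh
# Sketch of a log-housekeeping routine: raw logs untouched for more
# than 7 days move to the archive, and archived logs older than
# 30 days are purged. Retention windows are placeholders.

archive_logs() {
    log_dir=$1
    archive_dir=$2
    mkdir -p "$archive_dir"
    # Move raw logs older than 7 days into the archive
    find "$log_dir" -maxdepth 1 -name '*.log' -type f -mtime +7 \
        -exec mv {} "$archive_dir"/ \;
    # Remove archived logs older than 30 days
    find "$archive_dir" -maxdepth 1 -name '*.log' -type f -mtime +30 \
        -exec rm -f {} \;
}
```

In practice such a routine would run from a scheduler (e.g. Control-M or cron) so the cleanup never depends on a manual step.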
ETL & Mainframe Developer
Responsibilities:
- Design and development of Data Warehouse/ETL processes using Ab Initio and Informatica Cloud.
- Implemented high-speed unloads to handle large file volumes using API-mode unloads and parallel processing techniques.
- Unloaded data from, and loaded data to, MVS DB2 databases.
- Read and wrote distributed and mainframe files, including headers and trailers, using conditional DMLs.
- Developed dynamic graphs to load data from data sources into tables and to parse records.
- Developed and implemented extraction, transformation, and loading of data from legacy systems using Ab Initio.
- Developed ad hoc graphs to serve urgent requests from the business.
- Responsible for setting up repository projects using Ab Initio EME to create a common development environment the team can use for source code control.
- Optimized scripts to alert the source on SLA non-compliance and to process the correct file when the source delivers multiple files.
- Posted ZEKE message events on the mainframe from the ETL process.
- Used sandbox parameters to check graphs in and out of the repository.
- Worked with EME/sandbox for version control and performed impact analysis for various Ab Initio projects across the organization.
- Migrated scripts from DEV to SIT and UAT environments to test and validate data.
- Fixed production defects within the given SLA as part of production support.
- Tracked all activities and tasks using Jira.
- Implemented best practices in the development phase to deliver efficient code.
- Drafted runtime documents to give insight into the projects from a production support perspective.
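The SLA-alerting and multiple-file handling described above could look roughly like this; the landing directory, file pattern, and cutoff hour are hypothetical placeholders:

```shell
#!/bin/sh
# Hypothetical sketch: when the source drops more than one copy of a
# feed, keep only the most recent copy; if nothing has arrived by the
# SLA cutoff hour, raise an alert. All names are placeholders.

# Return the newest file matching the pattern (older duplicates lose).
pick_latest_feed() {
    landing_dir=$1
    pattern=$2
    ls -1t "$landing_dir"/$pattern 2>/dev/null | head -n 1
}

# Exit nonzero (after alerting) when no feed arrived by the cutoff.
check_sla() {
    landing_dir=$1
    pattern=$2
    cutoff_hour=$3
    latest=$(pick_latest_feed "$landing_dir" "$pattern")
    if [ -z "$latest" ] && [ "$(date +%H)" -ge "$cutoff_hour" ]; then
        echo "SLA MISS: no $pattern received in $landing_dir" >&2
        return 1
    fi
    return 0
}
```

A scheduler would typically invoke `check_sla` near the cutoff and route the stderr alert to the monitoring or paging channel.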
ETL Developer
Responsibilities:
- Deployed efficient strategies to address TCPA rules, participated in business requirement discussions, and gathered process requirements.
- Developed, tested, and implemented production inputs, and designed inbound/outbound procedures to capture and communicate proper dialer consent.
- Employed File Mover to centralize TCPA processes and secure transmission channels across lines of business.
- Provided end-to-end BI solutions with new adaptive ideas and approaches.
- Developed and implemented extraction, transformation, and loading of data from legacy systems using Ab Initio.
- Developed ad hoc graphs to serve urgent requests from the business.
- Responsible for setting up repository projects using Ab Initio EME to create a common development environment the team can use for source code control.
- Optimized scripts to alert the source on SLA non-compliance and to process the correct file when the source delivers multiple files.
- Executed successful releases involving automation, and redefined overall processes to save costs and make them compliant with JPMC’s policies and standards.