- Senior data integration analyst with 9+ years of experience in the data warehouse industry, covering the analysis, design, development, testing, and implementation of business workflows and parallel data processing solutions using Ab Initio, SQL, Java, database engines, and job schedulers.
- Knowledgeable Ab Initio specialist skilled in extraction, cleansing, dimensional data modeling, transformation, wrangling, and profiling of source data; integration of various sources such as mainframe, SAS files, XML, JSON, and HDFS; and data migration, leveraging Ab Initio components such as Rollup, Join, Lookup, ICFF, Scan, and many more.
- Strong experience in optimizing and tuning SQL queries, indexing, table partitioning, and writing and debugging stored procedures. Efficient in parallel data extraction and loading with MPP databases.
- Strong UNIX shell scripting skills with hands-on experience scheduling and automating Ab Initio jobs through the Tivoli job scheduler, AutoSys, and crontab.
- Well experienced in creating ETL design documents, test plans, and test cases, and in executing test scripts for data warehouse solutions.
- Good experience with the EME data store, air commands, and version control systems such as GitHub and SVN.
- Experienced in the Scrum/Agile model, managing quality processes, and coordinating go-live production releases.
- Strong in supporting, executing, validating, and monitoring ETL production jobs.
- Proficient communicator with strong interpersonal, analytical, and problem-solving skills.
- 3 years of project lead experience coordinating requirements, establishing work priorities, organizing assignments, and mentoring junior team members.
- Knowledgeable in big data technologies such as Hadoop Hive, Sqoop, and NoSQL databases.
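The crontab-based scheduling mentioned above can be sketched as a single entry; the schedule, script path, and log location below are illustrative assumptions, not taken from an actual deployment:

```shell
# Hypothetical crontab entry: run a nightly Ab Initio job wrapper at 2:30 AM
# on weekdays, appending stdout/stderr to a log file for troubleshooting.
# (All paths and names are illustrative.)
30 2 * * 1-5 /opt/etl/scripts/run_nightly_load.ksh >> /opt/etl/logs/nightly_load.log 2>&1
```

In practice the wrapper script would set up the Ab Initio environment before invoking the deployed graph, so the cron entry itself stays minimal.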
Data Integration & BI tools: Ab Initio products (GDE, Co>Operating System), Jaspersoft BI, Tableau, Hadoop ecosystem
Programming: SQL, Shell Scripting, Java, Python, Hive
Databases: Oracle 11g, Teradata 14, IBM DB2 9
File formats: SAS, mainframe EBCDIC, Avro, HDFS, multi-partition files, XML, JSON
Business domains: Insurance (Auto, Property & Casualty), Finance (Credit Cards, Student Loans)
Lead Ab Initio Consultant
- Profile, analyze, and evaluate Auto & Property policy data and prepare a data platform using Ab Initio for actuaries. Leverage the enterprise data warehouse and different tools to develop corporate research data files, information platforms, and data spaces.
- Evaluate technical designs and data mapping documents; use the designs to develop accurate, defect-free code and to test programs using Ab Initio, SQL, shell scripting, Java, Python, Oracle, and SAS in a UNIX environment.
- Execute, monitor, and validate data for key policy, quote, and claim applications; provide on-site support and perform root-cause analysis to fix any issues that arise in data sets.
- Participate in production support transition meetings.
- Participate in code reviews and test plan reviews.
- Provide input in creating detailed ETL development estimates and change requests.
- Perform performance tuning in both UNIX ETL and Oracle database environments.
- Work closely with data scientists, actuaries, and business managers to prepare requested data sets. Prepare use case documents to evaluate which ETL tools are best suited.
- Technical environment: Ab Initio GDE 3.2.2, Co>Operating System 3.2.4, Linux, Oracle 12c, Python, Java, SAS, Tivoli Workload Scheduler, and the Hadoop ecosystem
Ab Initio Consultant
- Provided subject matter expertise in the analysis and preparation of specifications and plans for development and modification of the EDW.
- Designed PULSE data marts and supported business analytics using the Teradata database. Proactively identified and communicated scope, design, and development issues and recommended potential solutions.
- Developed ETL code to meet all technical specifications and business requirements according to the established designs; assisted and mentored other team members.
- Coordinated work with the offshore team, business analysts, and data analysts to resolve gaps in EDW design specifications.
- Coordinated with data modelers/DBAs to design the data model and integrate database changes.
- Created and scheduled new jobs and supported existing production job runs.
- Kept ETL artifacts in the EME and deployed them to production using the Live Queue process.
- Technical environment: Ab Initio GDE 3.1.7, Co>Operating System 3.1.4, Linux, Teradata, Tivoli Workload Scheduler, and Cognos reporting
Ab Initio Consultant
- Migrated Perl drivers into Ab Initio Conduct>It plans.
- Analyzed and redesigned existing long-running EDW processes and improved their run performance. Loaded high-volume sales fact and dimension tables into Netezza.
- Technical environment: Ab Initio Co>Operating System 3.0.4, Linux, Oracle 11g, Netezza data warehouse appliance, and Business Objects
- Discussed requirements with business analysts, data architects, and QA, and prepared technical specifications.
- Designed, developed, and tested Ab Initio graphs, Java batch programs, and web applications.
- Implemented various parallelism techniques (data, component, and pipeline parallelism), repartitioning techniques, and multi-partition files and flows to speed up data processing wherever applicable.
- Wrote shell scripts to invoke graphs from the scheduler.
- Maintained the issue log, prioritized issues, resolved critical ones immediately, and coordinated the remaining issues with offshore team members; clarified questions related to requirements and design and guided them on technical issues.
- Installed and configured Ab Initio products, created EME projects in each environment, and set up sandboxes in the physical environments.
- Actively involved in code review and testing of Ab Initio graphs, Java programs, and shell scripts for correctness.
- Created and deployed Ab Initio tags and Java packages in the test environment; supported production releases.
- Technical environment: Ab Initio GDE 3.0.4, Co>Operating System 3.0.4, HP-UX, IBM DB2 9, SQL and PL/SQL, InetSoft reporting server, AutoSys scheduler, Excel, MS Visio, Java, and Tomcat web server
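The shell scripts that invoke graphs from the scheduler, mentioned above, can be sketched as a small wrapper; the function name, log location, and stand-in command below are illustrative assumptions, since a real deployed Ab Initio graph script is generated from the GDE:

```shell
#!/bin/sh
# Hypothetical wrapper (names illustrative): run a deployed Ab Initio graph
# script, capture its output to a timestamped log, and propagate the exit
# status so the scheduler (AutoSys, Tivoli, cron) can alert on failure.
run_graph() {
    graph_script="$1"
    log_file="/tmp/$(basename "$graph_script").$(date +%Y%m%d%H%M%S).log"

    # Run the graph, capturing stdout and stderr for troubleshooting.
    "$graph_script" >"$log_file" 2>&1
    rc=$?

    if [ "$rc" -ne 0 ]; then
        echo "graph $graph_script failed with rc=$rc, see $log_file" >&2
    fi
    return "$rc"
}

# Demo with a harmless stand-in command instead of a real deployed graph:
run_graph true && echo "graph OK"
```

Having the scheduler call a wrapper like this, rather than the graph script directly, keeps logging and failure alerting consistent across jobs.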
- Developed, tested, and deployed changes to a Java-based web application and business flows. Supported and monitored jobs in the production environment and resolved issues. Analyzed data, wrote SQL to generate reports, and created reports and dashboards using Jaspersoft Studio and KavaChart libraries.
- Technical environment: Java, Oracle, UNIX, IBM MQ, Jaspersoft BI, JSP, Struts, Hibernate, Spring, and WebLogic application server