ETL Lead Developer & Fircosoft Admin Resume
Dallas
SUMMARY
- Over 15 years of IT experience in the design, analysis, development, modeling, implementation, production support, and testing of various applications, decision support systems, and data warehousing applications using Ab Initio and other ETL tools.
- Experience in developing strategies for ETL using the Ab Initio tool in complex, high-volume data warehousing projects in both Windows and UNIX environments.
- Expertise in all phases of Software Development Life Cycle (SDLC), including project definition, analysis, design, coding, testing, implementation, and support.
- Worked with ODS (Operational Data Store), data validation, and cleansing processes using Ab Initio.
- Actively involved in data cleansing, data perturbation, database design, and data modeling.
- Strong experience developing reusable, generic/dynamic Ab Initio applications for high-volume DWH environments.
- Developed reusable Ab Initio custom components and worked extensively on parallel processing.
- Well versed in Ab Initio parallelism techniques; implemented Ab Initio graphs using data parallelism and Multi File System (MFS) techniques.
- Developed graphs to fetch data from sources such as DB2, Oracle, Teradata, Netezza, Excel, and flat files.
- Experience in using various Ab Initio components such as Join, Reformat, Scan, Rollup, Normalize, Denormalize, and the partition and departition components.
- Managers make business decisions based on the information in data warehouses; decision makers lose confidence when they encounter missing information, data inconsistencies, and implausible calculations. The only way to build confidence in enterprise data is to measure, correct, and monitor data quality across the enterprise.
- In Data Quality Assessment, business users create a configuration for a generic data quality application. A configuration defines a single data source, such as a database table or file with a particular set of fields, and tests for each field that detect when data values are missing or do not meet requirements (a minimal field-test sketch appears after this list).
- Good understanding of newer Ab Initio features such as component folding, Parameter Definition Language (PDL), continuous flows, queues, and publisher and subscriber components.
- Proficient in using Shell scripts and SQL for automation of ETL processes
- Good Experience working with various Heterogeneous Source Systems like Oracle 10g, DB2, MS SQL Server, Flat files, and Legacy Systems.
- Extensive experience with EME for version control, impact analysis and dependency analysis.
- Proficient with various Ab Initio Parallelism and Multi File System techniques.
- Extensive knowledge of dimensional data modeling, star schema, snowflake schema, creation of fact and dimension tables, OLAP, and OLTP, with a thorough understanding of data warehousing concepts.
- Responsible for monitoring production status and ensuring the ETL process works as expected; handle customer communication around production issues, manage the pipeline of tickets, and ensure on-time delivery. Provide Tier 3 support in triaging production issues related to loads, proprietary tools, and the UI.
- Develop, test, and debug automated tasks (Apps, Systems, Infrastructure)
- Troubleshoot priority incidents, facilitate blameless post-mortems.
- Work with development teams throughout the software life cycle ensuring sustainable software releases.
- Perform analytics on previous incidents and usage patterns to better predict issues and take proactive actions
- Build and drive adoption of self-healing and resiliency patterns.
- Lead and participate in performance tests; identify bottlenecks, opportunities for optimization, and capacity demands.
- Respond to system generated alerts/escalations relating to any failures on application platform.
- Perform root cause analysis and identify and implement corrective and preventive measures
- Document standards, processes and procedures relating to best practices, issues, and resolutions
- Participate in the 24x7 support coverage as needed.
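A minimal sketch of the kind of per-field data quality test referenced in the Data Quality Assessment bullet above, assuming a pipe-delimited input file; the field positions, field names, and rules are hypothetical, not the actual application configuration.

```bash
#!/usr/bin/env bash
# Hypothetical field-level data quality tests: count records where a field is
# missing or does not meet a simple requirement. Layout and rules are assumed.
set -euo pipefail

INPUT_FILE="${1:?usage: dq_field_check.sh <pipe-delimited-file>}"

awk -F'|' '
    NR == 1 { next }                                     # skip the header row
    {
        total++
        if ($3 == "")                         missing++  # field 3: id must be present
        else if ($5 !~ /^[0-9]+(\.[0-9]+)?$/) invalid++  # field 5: amount must be numeric
    }
    END {
        printf "records=%d missing_id=%d invalid_amount=%d\n", total, missing, invalid
    }
' "$INPUT_FILE"
```

In practice each failed test would map to an issue code and feed the enterprise-wide quality metrics rather than a simple console summary.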
TECHNICAL SKILLS
ETL Tools: Ab Initio (Co>Operating System, GDE 4 / 3.5.3, Conduct>It, BRE, Control Center, DQE, Metadata Hub, Plans), DataStage, Apache NiFi, Talend, AWS Glue
Data Profiling: Informatica 9.1
DevOps: Docker, Kubernetes, CI/CD Pipelines
Reporting Tools: MicroStrategy, Tableau
Big Data and Cloud: Hadoop, Spark 2.0 with Scala, Amazon Web Services (AWS)
Operating Systems: Windows, UNIX, MS-DOS
Languages: SQL, Shell Scripting, Python, Java, .NET, Scala, Rust
Database Access Tools: Teradata SQL Assistant, TOAD, SQL Developer, Snowflake Cloud DB
Databases: Teradata, Oracle 9i, MS SQL, DB2, MS Access, Netezza, Snowflake Cloud DB, Graph DB
Domain Knowledge: Banking, Retail, Insurance, Manufacturing, Healthcare
Version Control Tools: EME, Visual SourceSafe, ClearCase
Data Analytics & Machine Learning: Datameer, Neebo
PROFESSIONAL EXPERIENCE
Confidential, Dallas
ETL Lead Developer & Fircosoft Admin
Environment: Fircosoft, Ab Initio 4.0.1, MS SQL, Conduct>It, PDL, Oracle, EME, UNIX, Windows XP, XML, Web Services, Excel, and Ops Console.
Responsibilities:
- Coordinate with the client and offshore team on administrative activities for Ab Initio ETL and Fircosoft applications.
- Attend to critical application issues and maintenance activities.
- Meet with different stakeholders to build a high-availability architecture for real-time OFAC screening and test all layers of the Fircosoft application.
- Ensure safety and security measures such as taking backups at regular intervals (a minimal backup sketch appears after this list).
- Build the process, criteria, and plan for establishing an understanding of the current state of cardholder data quality across the data sets.
- Build applications on new servers and clone existing environments.
- Create documentation of the standard procedures to follow for the latest versions of Fircosoft tools and utilities.
- Migrate applications and databases from Solaris to Linux servers.
- Onboard new clients and handle their ETL processing.
- Maintain database schemas and large volumes of data.
- Communicate knowledge of production incidents to the appropriate run team for each application.
- The agile team creates a CHG ticket in ServiceNow once the fix is ready for release and associates the CHG ticket with the INC ticket.
- The scrum master follows the PMO process for the JIRA release process.
- A Configuration Item (CI) is any physical or logical component or service asset of an IT service; information about each CI is stored in a configuration record within the Configuration Management Database.
- The processes, policies, and procedures related to preparing for the recovery or continuation of technology infrastructure, systems, and applications that are vital to an organization after a disaster or outage.
- On-call support: a paging group consisting of members of a particular technical support team.
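A minimal sketch of a scheduled backup with simple retention, in the spirit of the backup routine mentioned above; the paths, retention window, and cron entry are assumptions, and the actual Fircosoft and database backup procedures would differ.

```bash
#!/usr/bin/env bash
# Hypothetical scheduled backup of application files with simple retention;
# all paths and the retention period are assumptions.
set -euo pipefail

APP_DIR="/opt/fircosoft"                 # assumed application directory
BACKUP_DIR="/backup/fircosoft"           # assumed backup target
RETENTION_DAYS=14                        # assumed retention window
STAMP=$(date +%Y%m%d_%H%M%S)

mkdir -p "$BACKUP_DIR"

# Archive the application directory.
tar -czf "$BACKUP_DIR/fircosoft_${STAMP}.tar.gz" \
    -C "$(dirname "$APP_DIR")" "$(basename "$APP_DIR")"

# Drop archives older than the retention window.
find "$BACKUP_DIR" -name 'fircosoft_*.tar.gz' -mtime +"$RETENTION_DAYS" -delete

# Example crontab entry (daily at 01:00):
# 0 1 * * * /opt/scripts/backup_fircosoft.sh >> /var/log/fircosoft_backup.log 2>&1
```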
Confidential, TX
ETL Lead Developer & Application Support
Environment: Ab Initio GDE 4.1, Co>Operating System 4.0.1 and 3.3.3.7, Oracle, PDL, Control Center, EME, UNIX, Windows XP, XML, Mainframe, HDFS, Big SQL, Teradata, MS SQL.
Responsibilities:
- Understand the FLJ process for Hadoop onboarding as a SOR.
- Implemented the SOR file loads from the dropbox through Ab Initio graphs.
- Set up the connectivity for inbound and outbound feeds.
- Prepared the UDS entity PG/DM intake forms.
- Worked with the DSA team on the physical and logical models for SZ loads.
- Registered entities through Ab Initio graphs in the UDS portal.
- Implemented the ingestion process in Hadoop using Ab Initio graphs.
- Implemented the CDC process from UDS entities in Ab Initio.
- Implemented the SZ and CZ in Hadoop and data sets in Hive.
- Implemented the Teradata loads and MS SQL loads using Ab Initio graphs.
- Implemented the AUTO data processing system through Conduct>It, with Ab Initio graphs, scripts, and the conditional logic of the system in Ab Initio plans.
- Working on tech refresh for Ab Initio server moves and server validations with the Hadoop platform team.
- Used Express>It for creating the save files for code promotion.
- Developed scripts that compare all objects in the prod and DR functional sandboxes and email a report of the out-of-sync objects to the production support team (a simplified sketch appears at the end of this list).
- Implemented the continuous graph for SOR ingestion and the TDQ process using Ab Initio queues.
- Implemented the Apache Kafka continuous graph in Hadoop.
- Developed the job scripts, Control-M jobs, and audit entities in the Oracle DB; worked with the IG TSM for UAT and PROD migrations.
- Resolved all Ab Initio dependency analysis issues, making sure we have end-to-end lineage in MDR.
- Worked with the ETL COE on common components for the Hadoop ecosystem.
- Performed data validations to make sure data was loaded as per the requirements.
- Used the AIM GFS automation tool to migrate Teradata objects to higher environments; worked with Hive to load data into Hive tables; used the Express>It automation tool to migrate Ab Initio objects to integration.
- Worked with the UDS portal to maintain the Hadoop metadata tables and migrate those entities to IST.
- Have worked with and have good knowledge of continuous flows.
- Developed continuous flow applications using Ab Initio continuous components such as Subscribe, MQ Subscribe, Multi Publish, Publish, Continuous Join with DB, and Continuous Update Table.
- Implemented Continuous Flows' robust interfacing to these legacy solutions through the special processing components Subscribe and Publish, which also handle rollback and recovery in the event of failures.
- The Continuous Flows approach to real-time processing brings together the productivity gains of graphical dataflow implementation, a truly general model for connecting to data streams, and robust mechanisms for reliably checkpointing work and coordinating transactions.
- Lead the support team and ensure that the application is running as per the SLA.
- Communicate with stakeholders in case of delays or SLA breaches.
- Maintain and own problem management and incident resolution process.
- Resolve incidents within defined service levels.
- Manage and own application problems and incident life cycle, drive those to permanent resolutions, implement preventive measures to avoid repetitive issues.
- Debug, troubleshoot, root cause analysis, and resolve issues within SLA.
- Resolve complex problems with long term fixes following BI best practices. Collaborate with BI team for solution review of complex bugs and incident resolution.
- Follow up with third party vendors for issue resolution and prevention.
- Perform weekly incident/problem review meetings with application team.
- Responsible for 24 x 7 production system support, including off hours and weekend 'on call' production support responsibilities.
- Daily spot check of system health.
- Working on tech refresh for software upgrades and prod servers.
- Manage and own application release process.
- Support release of BI deliverables into production and perform on demand meetings with BI team supporting issues encountered due to production changes.
- Follow BI change control and continue integration process for changes to production system, owning all changes made to production system.
- Responsible for continued system improvements to help reduce tech debt costs
- Perform custom, ad-hoc data analysis and reporting based on stakeholder requests within SLA.
- Support weekly and monthly customer report processing.
- Conduct regular presentations to team/stakeholders.
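A simplified sketch of the prod/DR sandbox comparison and email report mentioned above, assuming object lists can be produced as sorted text files; the hostnames, paths, listing commands, and recipient address are assumptions rather than the actual implementation.

```bash
#!/usr/bin/env bash
# Simplified prod vs DR sandbox comparison: list objects on both sides, diff
# the lists, and mail the out-of-sync objects to the support team.
# Hostnames, paths, and the recipient address are assumptions.
set -euo pipefail

PROD_SANDBOX="/data/sandboxes/prod/functional"
DR_SANDBOX="/data/sandboxes/dr/functional"
DR_HOST="dr-etl-host"                          # assumed DR server
REPORT="/tmp/sandbox_sync_report_$(date +%Y%m%d).txt"
MAILTO="prod-support@example.com"              # placeholder distribution list

# Relative object lists (graphs, plans, dml, xfr, scripts) from each side.
find "$PROD_SANDBOX" -type f -printf '%P\n' | sort > /tmp/prod_objects.lst
ssh "$DR_HOST" "find '$DR_SANDBOX' -type f -printf '%P\n'" | sort > /tmp/dr_objects.lst

# Objects present on only one side are "not in sync".
comm -3 /tmp/prod_objects.lst /tmp/dr_objects.lst > "$REPORT"

if [ -s "$REPORT" ]; then
    mailx -s "Prod/DR sandbox objects not in sync" "$MAILTO" < "$REPORT"
else
    echo "Prod and DR sandboxes are in sync." | mailx -s "Prod/DR sandbox check OK" "$MAILTO"
fi
```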
Confidential, NJ
Ab Initio Lead Developer
Environment: Ab Initio GDE 3.1.4, Co>Operating System 3.1.4.4, MS SQL, Conduct>It, PDL, DB2, EME, UNIX, Windows XP, XML, Web Services, Excel, and Ops Console.
Responsibilities:
- Attended the technical discussion meetings and prepared the low-level design documents.
- Developed the graphs and plans based on requirements.
- Prepared the mapping document based on the data model.
- Prepared the schedule document for Ops Console.
- Data quality metrics are computed when you run a data quality configuration. Metrics such as Completeness, Conformity, and Accuracy give the percentage of records that have no issues; each of the standard metrics has associated issue codes (a minimal completeness sketch appears after this list).
- Business users detect issues, view data quality results, and monitor results in Data Quality Assessment. The Data Quality Warehouse, which you install and configure separately from Data Quality Assessment, makes it possible to compute metrics for dimensions of your data.
- Configured Data Quality Assessment to measure the data quality of an input source: you create a configuration by selecting an application, giving the configuration a name, and providing the data needed to run the application for Data Quality, Create Dataset, and Create Lookup.
- Worked with the code migration team on Dev-to-QA and QA-to-Prod promotions.
- Prepared the unit testing and QA testing documents through the QC tool.
- Implemented job scripts in UNIX.
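A minimal sketch of how a completeness-style metric can be computed for one field, in the spirit of the metrics described above; this is not the Data Quality Assessment implementation, and the file layout, field position, and threshold are assumptions.

```bash
#!/usr/bin/env bash
# Illustrative completeness metric for a single field: percentage of records
# with a non-blank value, checked against a threshold. Layout, field position,
# and threshold are assumed; not the Data Quality Assessment implementation.
set -euo pipefail

INPUT_FILE="${1:?usage: completeness.sh <pipe-delimited-file>}"
FIELD_POS=4          # assumed position of the field being measured
THRESHOLD=95         # assumed minimum acceptable completeness (percent)

read -r total nonblank < <(awk -F'|' -v f="$FIELD_POS" '
    NR == 1 { next }                          # skip the header row
    { total++; if ($f != "") nonblank++ }
    END { print total + 0, nonblank + 0 }
' "$INPUT_FILE")

pct=$(awk -v n="$nonblank" -v t="$total" 'BEGIN { printf "%.2f", (t ? 100 * n / t : 0) }')
echo "completeness=${pct}% (threshold=${THRESHOLD}%)"

# Non-zero exit signals that the metric fell below the threshold.
awk -v p="$pct" -v th="$THRESHOLD" 'BEGIN { exit (p + 0 < th) ? 1 : 0 }'
```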
Confidential, NJ
Ab Initio Lead Developer & ETL Tester
Environment: Ab Initio Co>Operating System 3.1.3, GDE 3.1.3, MS SQL, DB2, Teradata, EME, UNIX, Windows XP, and BRE.
Responsibilities:
- Designed and developed plans and graphs based on business requirements.
- Designed jobs in the Ops Console.
- Integrated the application with the Metadata Hub.
- Worked with XML files.
- Prepared high-level flow design documents.
- Prepared low-level design documents and performed unit testing.
- Worked as the ETL tester for this module.
- Worked with transaction graphs using the Begin Transaction and Commit Transaction components.
- Implemented file watcher scripts in UNIX (a minimal sketch appears after this list).
- Implemented jobs in the Ops Console.
- Implemented the business rules in BRE.
- Implemented job scripts in UNIX.
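A minimal sketch of a UNIX file watcher along the lines mentioned above: it polls for an expected trigger file until a timeout, then hands off to the downstream job; the paths, polling interval, timeout, and downstream command are assumptions.

```bash
#!/usr/bin/env bash
# Simple file watcher: polls for an expected trigger file, then kicks off the
# downstream load. Paths, polling interval, and timeout are assumptions.
set -euo pipefail

WATCH_FILE="/data/inbound/ready/customer_feed.trg"   # assumed trigger file
POLL_SECS=60                                          # check once a minute
MAX_WAIT_SECS=$((4 * 60 * 60))                        # give up after 4 hours
elapsed=0

until [ -f "$WATCH_FILE" ]; do
    if [ "$elapsed" -ge "$MAX_WAIT_SECS" ]; then
        echo "ERROR: $WATCH_FILE not received within ${MAX_WAIT_SECS}s" >&2
        exit 1
    fi
    sleep "$POLL_SECS"
    elapsed=$((elapsed + POLL_SECS))
done

echo "Trigger file found: $WATCH_FILE"
# Hand off to the downstream job (placeholder command).
# /opt/etl/scripts/run_customer_load.sh
```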
Confidential
Environment: Ab Initio Co>Operating System 2.15, GDE 1.15.19, DB2, EME, UNIX, Windows XP, Control-M
Responsibilities:
- Created the sandboxes in the development and QA environments using the EME.
- Developed the graphs using the GDE, with components such as Partition by Round-robin, Partition by Key, Rollup, Sort, Scan, Dedup Sorted, Reformat, Join, Merge, Gather, and Concatenate.
- Developed Ab Initio graphs for data validation using validate components.
- Involved in monitoring Ab Initio jobs using the Ab Initio report options.
- Divided the graphs into phases and created checkpoints to avoid deadlocks; involved in testing and debugging of the graphs.
- Used validate components such as Check Order, Compare Records, and Validate Records to test, debug, and check the data records.
- Wrote Tivoli Maestro schedule scripts to schedule the Ab Initio graphs.
- Heavily used UNIX shell scripting for writing SQL execution scripts in the data loading process (a minimal sketch appears after this list).
- Involved in quality assurance and quality auditing.
- Defined, developed, and maintained an automation/regression testing harness.
- Involved in data cleansing auditing.
- Involved in writing unit test scripts and unit-tested graphs as per the scripts.
- Performed database testing using TOAD.
- Involved in CMR and code promotion.
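An illustrative wrapper for the kind of SQL execution script mentioned above, assuming a DB2 command-line client is configured on the host; the database alias, credentials handling, and log locations are assumptions, not the actual load scripts.

```bash
#!/usr/bin/env bash
# Illustrative wrapper for running a SQL script as part of a data load,
# assuming a configured DB2 command-line client; database alias, credentials
# handling, and file locations are assumptions.
set -euo pipefail

SQL_FILE="${1:?usage: run_sql.sh <sql-file>}"
DB_NAME="DWHDB"                                   # assumed database alias
LOG_FILE="/var/log/etl/$(basename "$SQL_FILE" .sql)_$(date +%Y%m%d_%H%M%S).log"

# Connect, run the script (-t: statements end with ';', -v: echo statements,
# -f: read from file), then disconnect; capture everything in the log.
{
    db2 connect to "$DB_NAME"
    db2 -tvf "$SQL_FILE"
    db2 connect reset
} > "$LOG_FILE" 2>&1

echo "SQL script $SQL_FILE completed; log: $LOG_FILE"
```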