Lead Data Developer/Big Data Engineer Resume
North Brunswick, NJ
PROFESSIONAL SUMMARY:
- Overall 16 years of IT experience, including 13+ years of strong experience in data analytics development, data modeling, deep data analysis, and the full SDLC of Enterprise Data Warehousing, Business Intelligence, and Big Data (Hadoop ecosystem) solutions, with hands-on development, leadership, stakeholder interaction, and management experience in ETL/ELT architectures spanning Data Modeling, Data Integration, Data Migration, Data Analytics, Data Architecture, Master Data Management, Metadata Management, and Data Governance implementations; well practiced in roadmap planning, best practices, standardization, and data strategies/principles.
- 3+ years of hands-on experience building HDFS data lakes using Apache Hadoop, HDFS, Spark, MapReduce, Python, IMPALA, HIVE, HUE, HBase, Kafka, Sqoop, Flume, Solr, Oozie/ZooKeeper, YARN, and Cloudera Manager/Enterprise, along with Ab Initio integration with the HDFS/YARN environment on Cloudera Distribution for Hadoop (CDH 5.x) and AWS cloud computing.
- 13+ years of SME-level experience in major ETL/BI tools (Ab Initio 3.x, Informatica 9.x), covering requirements gathering, design, estimation, data analytics, architecture, implementation, and delivery across Agile/Scrum, DevOps, Lean, and Waterfall SDLC models.
- More than 10 years of functional experience in the financial/banking domain, including Capital Markets, Investment Banking, Credit Risk Management, Asset Management, Basel regulatory compliance, BCBS 239, Dodd-Frank, CCAR, VaR, CVA, Compliance Risk Assessment, AML implementation, analytics platforms and monitoring, and compliance regulatory reporting.
- Extensive hands-on experience with the Ab Initio product suite: GDE, Conduct>It, Co>Operating System, EME, Technical Repository, BRE, ACE, DQE (Data Profiler), Metadata Hub (MDH), Operations Console, Control Center, Express>It, Query>It, SFD, ICFF, Micrographs, Continuous Flows, performance tuning, code reviewing, and EME standards and best practices in Ab Initio.
TECHNICAL SKILLS:
Technology and Tools: Beginner-level experience in BI tools SAP Business Objects 6.5, MicroStrategy 10, QlikView, Tableau Analytics, and MS OLAP. Strong experience in RDBMS on ORACLE 10g/11g, Teradata 15.x, Sybase ASE 15.x, MS SQL Server 2014, and IBM DB2. Extensive experience in shell scripting, AWK, SED, Perl, Python, and Control-M/Autosys JIL programming/scheduling. Experience in data modeling schemas (RDBMS, multi-dimensional modeling (star schema, snowflake), MOLAP, ROLAP, and HOLAP) using CA Erwin Data Modeler r9.x, SAP PowerDesigner, and SQL Data Modeler. Hands-on with DevOps tools GitHub, Atlassian Confluence, JIRA, Jenkins, and IBM uDeploy, as well as J2EE, MS OLAP, SSIS, and SSRS. Exposure to MongoDB, Pivotal Greenplum, PostgreSQL, and AWS S3, Redshift Spectrum, QuickSight, Athena, Kinesis, Glue, Lambda, EC2, EMR, and DynamoDB on AWS cloud computing.
Language(s): C/C++, Python, JCL, J2EE, JMS, XML, UML Modeling, XSDL, SQL, PL/SQL, SED, AWK, UNIX Shell/Perl Scripting
GUI: Visual Basic 6.0, ActiveX, and ADO 2.5.
Big Data Techs: Apache Hadoop 2.x, Spark 2.x, Python 3.x, MapReduce, HBase, HIVE, IMPALA, HUE, Sqoop, Flume, Kafka, Oozie, ZooKeeper, YARN, and MongoDB on Cloudera CDH 5.x/MapR.
Data Warehousing - ETL Tools: Ab Initio 3.x (GDE, Co>Operating System 3.x), MS SSIS, Elementum 3.x, Talend Big Data, Informatica PowerCenter 9.x, Oracle ODI, MS DTS, MS SSRS.
OLAP Tools: MS SSAS, SSRS, MicroStrategy 10.x, QlikView 12.x, SAP Business Objects 6.5.
Database Modeling Tool: MS Visio, Erwin Data Modeler r9, Oracle Data Modeler, SAP Power Designer.
Database(s): ORACLE 10g/11g, Teradata 15.0, Sybase ASE 15.x, IBM DB2 UDB EEE, MS SQL Server 2000/2005/2008/2014, IBM Netezza 3.0.
Web Technologies: Active Server Pages 3.0, VB Script, and JavaScript.
Web/Application Server: Servlets, IIS 4.0, MTS, JRun Server.
Operating Systems & Desktop Tools: MS Office Suites, MS Visio, Windows, Windows NT, SUN Solaris 5.8, HP-UX, IBM-AIX, Red Hat Enterprise Linux 3.0 and Cygwin 10.0.
Job Scheduling Tools: IBM Tivoli, Control-M, Autosys, and UNIX Crontab.
Other Tools: Atlassian Confluence, JIRA, Jenkins, IBM UrbanDeploy, TOAD 11.5, InterScope 7.0, AquaData, SQuirreL SQL, ScriptIT and Express>It.
PROFESSIONAL EXPERIENCE:
Lead Data Developer/Big Data Engineer
Confidential, North Brunswick, NJ
Responsibilities:
- Gathered all the Sources of CLRTY 2.0 such as EIW, SSDM, SHAW, ECaR, ACAR, HOGAN, DRM and Consumer Lending Mortgages Data of WFHE and WFHM.
- Worked in a DevOps, Agile/Scrum methodology with the Product Owner, Senior Management Team, Program Manager, and all stakeholders of the source and downstream systems.
- Analyzed various CLRTY 2.0 requirements across all the implementations: SSDM, HELOC, Home Equities, FDR to SHAW, Single ETL, MSP/MAPS and CLRTY DQE/DMC for the BCBS 239 DQ initiatives under the Master Data/Data Governance architecture implementation.
- Prepared the FRD (Functional Requirement Document) and Data Level Mapping Design (LLD) documents.
- Interacted closely on all deliverables and issues with End Users, Business Analysts, Data Analytics, Program Managers, Team Leads, DBA Teams, Stakeholders and Tech Leads, along with the Ab Initio vendor team.
- Contributed to the new CLRTY 2.0 framework architecture, replacing SQL-based data warehouse processing with ETL/ELT-based processing, and assisted Senior Data Architects, Consumer Lending experts, BAs and Senior Managers in building data strategies/principles/architectures (ETL/BI analytics).
- Worked on the feasibility of Ab Initio - Hadoop integration in the CLRTY environment under EIT; developed prototypes, a pilot implementation, and Co>Operating System 3.3.x integration with YARN on the Cloudera (CDH) 5.x distribution.
- Contributed to the architecture of the CLRTY analytical applications on the Data Lake, covering data classification, formats, transformations, sources, targets and persistence mechanisms, following the data strategies/principles for batch-processing integration, data flow between components, functional dependencies and middleware layers.
- Developed Apache Spark based batch data analytics processing (ETL conversion) using Python, PySpark, Spark SQL, RDDs, in-memory processing, SparkContext and HiveContext (HiveQL), ingesting MS SQL Server data through JDBC connections and Sqoop import/export, and automated/scheduled the ingestion and data quality checks (BCBS 239 implementation) with Oozie workflows through HUE; a representative ingestion sketch follows this list.
- Performed POC and pilot implementations of high-volume data ingestion and retrieval for building data analytics platforms using Hadoop, Spark, HIVE, IMPALA, Python, MapReduce, HBase, Sqoop, Flume, Oozie and YARN on Cloudera (CDH 5.x, 25+ nodes, 100 TB per node) clusters, processing all major HDFS file formats: Parquet, Avro, SequenceFile, JSON, XML and CSV/TXT.
- Evaluated Hadoop infrastructure requirements and design/deployment solutions (high availability, big data clusters, elastic load tolerance, etc.) and contributed to the automation, installation and monitoring of Hadoop components, specifically Spark, Python, HBase, HIVE, IMPALA, MapReduce, YARN, Oozie, Kafka and Sqoop.
- Integrated Tableau with Impala and Hive as a POC implementation for data analytics/visualization reports and dashboards, along with the downstream Tableau stakeholders.
- Worked with Vendor support team on the issues of infrastructures, Vendor products (Cloudera CDH), Ab Initio and Tableau.
- Managed data lineage, source versioning, data quality, logical models, physical models and impact analysis; organized the correlated projects, supported operational issues and data strategy complexities across the data dictionary, Master Data Management, Metadata Management and Data Governance using Ab Initio EME, Express>It, DQE/Data Movement Control (DMC) and Enterprise Metadata Hub (EMH).
- Performed Ab Initio/generic graph development and enhancements with these features: MFS data parallelism, the PDL approach, JMS queue batch processing and component folding, with full performance optimization of Ab Initio resource utilization on high-volume processing, and developed UNIX shell scripts using AWK and SED.
- Contributed BRE (Express>It 3.2) workflows/templates for reusable business application configurations (appconf) and rulesets on various CLRTY 2.0 DQE-based implementations for DQ analytics dashboards, working with the BA and QA teams.
- Unit tested the Hadoop analytics applications, Ab Initio ETL data analytics and shell scripts, and participated in QA/User Acceptance Testing of the deliverables.
- Supported the traditional job schedules (continuous and batch) on a daily basis and monitored the Oozie/ZooKeeper jobs with YARN.
- Resolved performance issues, performed impact analysis with end users on data processes, and provided analytical solutions for production issues.
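For illustration, the sketch below shows the kind of PySpark JDBC ingestion and BCBS 239-style data quality check described in the responsibilities above. It is a minimal, hypothetical example: the server, database, table and HDFS path names are placeholders rather than actual CLRTY 2.0 objects, and in the real pipeline the job was automated and scheduled through Oozie workflows via HUE.

```python
# Minimal PySpark ingestion sketch; all object names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("clrty-ingest-sketch")
    .enableHiveSupport()          # allows saveAsTable() to write Hive tables
    .getOrCreate()
)

# Pull a source table from MS SQL Server over JDBC (Sqoop import was the alternative path).
loans = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://mssql-host:1433;databaseName=lending")   # placeholder
    .option("dbtable", "dbo.loan_activity")                                    # placeholder
    .option("user", "etl_user")
    .option("password", "****")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)

# Simple BCBS 239-style data quality gate: rows missing mandatory keys go to a reject area.
valid = loans.filter(F.col("loan_id").isNotNull() & F.col("as_of_date").isNotNull())
rejects = loans.filter(F.col("loan_id").isNull() | F.col("as_of_date").isNull())
rejects.write.mode("append").parquet("/data/clrty/dq_rejects/loan_activity")   # placeholder path

# Persist the curated data as a partitioned Hive table for HIVE/IMPALA consumers
# (assumes the target Hive database already exists).
(
    valid.withColumn("load_dt", F.current_date())
    .write.mode("append")
    .partitionBy("load_dt")
    .format("parquet")
    .saveAsTable("clrty.loan_activity")                                        # placeholder table
)

spark.stop()
```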
Environment: Ab Initio 3.3.4, Apache Hadoop 2.x, Spark 2.x, Python 3.x/2.7, MapReduce, HIVE, IMPALA, HBase, Sqoop, Flume, HUE, Oozie/Zookeeper, YARN, Cloudera CDH 5.x (Manager/Navigator/Enterprise), MS SSIS, MS SSAS, MS SSRS, Autosys, SQL, PL/SQL, SQL Developer 4.2, Ab Initio Co>Operating System 3.3.2 and 3.2, Ab Initio EME 3.3.2, Express>It 3.3.2, UNIX Shell Scripting, ORACLE Data Modeler 9.0, MS Visio 2016, Perl, AWK, SED, J2EE, JavaScript, JMS Queue, Atlassian JIRA Agile 6.4, Atlassian Confluence 6.4, Anaconda 3.x, CloudBees Jenkins Enterprise 2.4, DevOps, Cloud, XML, XSD, CVS, Sun OS 5.10, Linux, Teradata 15.0, SQL Assistant 15.0, SQL Server 2008/2014, MongoDB, ORACLE 10/11g, Cygwin 10.0 and Windows 10/Windows 7.
Data Architect/Lead Developer/Analyst
Confidential, NY
Responsibilities:
- Captured all the sources of the EDBI system, such as DCS-CM, PDS, PES, CDS-A, CDS-R, SBR (Sales), TDF (Ticket), SDF (Sales), E-Code, GatePass and Compensation; these are sources serving various purposes across the airline system's operations.
- Interacted on program strategy and roadmap planning for this Agile/Scrum program with the Product Owner, Senior Management Team, Senior Operations Manager, Directors, Program Manager and downstream stakeholders.
- Analyzed the requirements of this program's implementations: DCS-CM feed processing for passenger and bag activities, PDS/CDS data load replication, GatePass, compensation processing, OFC replication between data centers, and productization of all the implementations.
- Prepared the FRD (Functional Requirement Document), High Level Design, and Low-Level Design (LLD) documents.
- Interacted closely on all deliverables and issues with End Users, Business Analysts, Data Analytics, Program Managers, Team Leads, DBA Teams and Ab Initio Tech Leads, along with the Ab Initio vendor team.
- Implemented the re-engineering of PDS/CDS data load replication, loading data into the EDW and IDW environments through a single process, with logical and physical data modeling (star/snowflake schema).
- Accomplished 100% of deliverables on time in each Agile/Scrum sprint, which helped the EDBI program migration.
- Designed the Dimensional Data Mart/Star Schema Modelling using Erwin Data Modeler 9.0, ORACLE SQL Modeler.
- Performed the Data Lineage, Source Versioning, Logical Models, Physical Models, and Data Analysis using Ab Initio EME, Metadata Hub, and Technical Repository Portals.
- Worked extensively with XML list/XSD sources using XML components, along with other regular data sources.
- Implemented Ab Initio/generic graphs with these features: MFS data parallelism, the PDL approach, JMS queue batch processing and component folding, with full performance optimization of Ab Initio resource utilization on high-volume processing, and developed UNIX shell scripts using AWK and SED.
- Modified the existing BRE (Express>It 3.2) workflows/templates for reusable business configurations and rulesets for various customer rewards program, compensation and BI analytics purposes, working with the business team.
- Developed SQL, PL/SQL procedures, triggers and functions on Teradata 15.x and ORACLE 11g, and used the Teradata utilities FastExport, FastLoad, MultiLoad, TPump and BTEQ with Ab Initio for high-performance loading and processing; a hedged Teradata access sketch follows this list.
- Unit tested the Ab Initio graphs and UNIX shell scripts, and participated in QA/User Acceptance Testing of the deliverables.
- Supported the job automation schedules (continuous and batch) on a daily basis and provided on-call support.
- Resolved performance issues, performed impact analysis with end users on data processes, and provided analytical solutions for production issues.
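For illustration, the sketch below shows one way the Teradata-side SQL could be driven from Python using the teradatasql driver. The host, credentials, database and table names are hypothetical placeholders, and the high-volume loads themselves ran through FastExport/FastLoad/MultiLoad/TPump and BTEQ invoked from Ab Initio.

```python
# Hypothetical Teradata access sketch using the teradatasql driver;
# host, credentials, database and table names are illustrative placeholders.
import teradatasql

# Insert only passengers not already present in the dimension (set-based load of new rows).
UPSERT_NEW_ROWS = """
INSERT INTO edbi.dim_passenger (passenger_id, last_flight_dt)
SELECT src.passenger_id, src.last_flight_dt
FROM edbi_stg.pds_passenger src
WHERE NOT EXISTS (
    SELECT 1 FROM edbi.dim_passenger tgt
    WHERE tgt.passenger_id = src.passenger_id
)
"""

# Parameterized audit-trail insert (question-mark bind parameters).
AUDIT_ROW = """
INSERT INTO edbi_stg.pds_load_audit (load_id, feed_name, load_ts, row_count)
VALUES (?, ?, CURRENT_TIMESTAMP(0), ?)
"""

with teradatasql.connect(host="td-host", user="etl_user", password="****") as con:
    cur = con.cursor()
    cur.execute(UPSERT_NEW_ROWS)
    cur.execute("SELECT COUNT(*) FROM edbi.dim_passenger")
    total_rows = cur.fetchone()[0]
    cur.execute(AUDIT_ROW, [20180101, "PDS_PASSENGER", total_rows])
    print("dim_passenger rows:", total_rows)
```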
Environment: Ab Initio GDE 3.2.5, Ab Initio Co>Operating System 3.2.5 and 3.0, Ab Initio EME 3.2 and 3.0, BRE 3.1, ACE 3.1, Express>It 3.2, Conduct>It 3.2, PDL, IBM Tivoli Workflow Programming/Scheduling, Crystal Reporting, SQL, PL/SQL, Python 3.0, SQL Developer 4.2, UNIX Shell Scripting, ORACLE Data Modeler 9.0, Perl, AWK, SED, J2EE, JavaScript, JMS Queue, Atlassian JIRA Agile 6.4, Atlassian Confluence 6.4, XML, XSD, CVS, Sun OS 5.10, Linux, Teradata 15.0, SQL Assistant 15.0, ORACLE 10/11g, Cygwin 10.0 and Windows 10/Windows 7.
Data Architect/Data Engineer
Confidential, Wall St, NY
Responsibilities:
- Proficiency-level experience with the compliance AML/risk monitoring systems: MANTAS, Actimize, Confidential's Archer, AML Metrics, GLMS, NPA, Confidential KYC, Mandatory Absence, Trade Surveillance and other AML monitoring tools.
- Analyzed the requirements of various implementations: COMRAD CR8 Metrics implementation, Prepaid Cards, MANTAS and Actimize extractions, Prepaid Cards metrics calculation, Trade Surveillance, non-AML Monitoring, Compliance Horizontals (HARA) and Compliance Staffing.
- Performed data analysis of AML metrics and non-AML compliance risk assessments while building the compliance risk analytical platforms across lines of business (LOB) and geographies, working with compliance risk officers and compliance management.
- Prepared all the FRD (Functional Requirement Document), High Level Design, and Low-Level Design (LLD) documents.
- Interacted closely on all the deliverables and any issues with End Users, Business Analyst, Compliance Analytics, Compliance Officers, Compliance Regional Teams, DBA Teams and CATE Team along with Ab Initio Vendor Team.
- Achieved the re-engineering of all COMRAD applications to move to the EDW/BI architecture with the help of the Confidential CATE team, covering data modeling, design of the logical and physical data models, separation of all sources into a set of correlated star/snowflake schema model streams, Ab Initio graphs, pre/post processes, job wrapper scripts and scheduling of the streamed jobs.
- Designed the Dimensional Data Mart/Star Schema Modelling using Erwin Data Modeler 9.x, SQL Data Modeler 3.2.
- Delivered the EDW ETL/BI architecture, primarily designing the logical and physical dimensional and relational models for Compliance Staffing, AML Metrics Analytics, MANTAS metric calculation, Actimize Analytics and Confidential KYC Analytics, following Confidential CATE standards and data modeling best practices.
- Managed data lineage, source versioning, logical/physical models and impact analysis; organized the correlated projects under Master Data Management and supported operational activities using Ab Initio EME, Metadata Hub and Technical Repository portals.
- Worked on prototype/POC development of the integration of MANTAS data extraction with the EDW-EAP system (Big Data - Hadoop HDFS) through Apache Hadoop, Spark, Python, Sqoop, Flume, HBase, MapReduce, HIVE and Oozie with YARN on Cloudera CDH 5.x (20-node) clusters, using the new Ab Initio Hadoop components; also worked with various Big Data formats such as Avro schemas, Parquet, Hive and HBase using Hadoop, Spark, Python and Ab Initio - Hadoop integration (a hedged format-conversion sketch follows this list).
- Implemented Ab Initio/generic graphs with these features: MFS data parallelism, the PDL approach, Continuous Flows, component folding and dynamic (ICFF) lookups, with full performance optimization of Ab Initio resource utilization on high-volume processing, and UNIX shell scripting using AWK and SED.
- Developed dashboard platforms, QlikView scripts, and SQL/PL-SQL procedures and triggers on ORACLE and Sybase.
- Unit tested the Ab Initio graphs and UNIX shell scripts, and participated in User Acceptance Testing of the deliverables.
- Resolved issues and problems affecting the 24x7 batch and continuous runs supporting users in all regions.
- Estimated the resource utilization, AIX to Linux Migration, Data Base/NAS/SAN Space requirements planning with Portfolio Architect Team and CTI Resource allocation teams.
- Managed the Production Deployments and maintained the streamed strategy on the deployment process.
- Provided the End-user support for all the production issues, resolve the transactional data processing issues, configuration issues and work with vendors for the application patch installation/deployments.
- Was primarily involved in Disaster Recovery (DR) testing for disaster failover scenarios and network problems.
- Resolved performance issues, performed impact analysis with end users on transactional data processes, and provided analytical solutions for production issues.
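For illustration, the sketch below shows a hedged PySpark version of the Avro-to-Parquet/Hive conversion described above. Paths, database and table names are hypothetical placeholders; it assumes the MANTAS extract has already been landed in HDFS as Avro (for example via Sqoop) and that the spark-avro package is on the classpath (on Spark versions before 2.4 the format name is "com.databricks.spark.avro").

```python
# Hypothetical PySpark Avro-to-Parquet/Hive conversion sketch; all names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("mantas-avro-to-parquet-sketch")
    .enableHiveSupport()
    .getOrCreate()
)

# Read the Sqoop-landed Avro extract from HDFS (requires the spark-avro package).
alerts = spark.read.format("avro").load("/data/eap/mantas/alerts_avro")   # placeholder path

# Light conformance: lower-case the column names and stamp a load-date partition column.
conformed = (
    alerts.toDF(*[c.lower() for c in alerts.columns])
          .withColumn("load_dt", F.current_date())
)

# Persist as partitioned Parquet and expose as a Hive table for HIVE/IMPALA and analytics
# users (assumes the target Hive database already exists).
(
    conformed.write.mode("overwrite")
    .partitionBy("load_dt")
    .format("parquet")
    .saveAsTable("eap.mantas_alerts")                                      # placeholder table
)

spark.stop()
```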
Environment: Ab Initio 3.2.5, Cloudera CDH 4.x, Hadoop 2.x, Apache Spark 2.x, Python 3.x, HBase, HIVE, Sqoop, Flume, MapReduce, Oozie and YARN, Ab Initio Co>Operating System 3.2.5 and 3.0, Ab Initio EME 3.2, DQE (Data Profiler) 3.2, BRE 3.1, Conduct>It 3.2, Metadata Hub, PDL, MicroStrategy 10.0, QlikView 12.x, Business Objects, SQL, PL/SQL, Python 3.4, SQL Developer 4.1, UNIX Shell Scripting, CA ERwin Data Modeler 9.0, Oracle SQL Modeler 4.0, Perl, AWK, SED, J2EE, JSP, JavaScript, Eclipse, XML, SVN, CVS, Sun OS 5.10, Linux, Autosys JIL/Control-M Scheduling, SYBASE ASE 15.x, Netezza 3.0, ORACLE 11g, IIS 7.0 and Windows 2000/Windows 7.
Lead ETL Developer/Data Analyst/Architect
Confidential, NY/Warren, NJ
Responsibilities:
- Proficiency-level understanding of the CAW system and upstream source systems: Trade Booking Front Office, ACE Server, Aggregator, CVA (Credit Valuation Adjustment), Global Market Risk & CE (Credit Engine) and Global Market Data.
- Analyzed the requirements of various CAW implementations for ACE - PSE (Pre-Settlement Exposure), CVA, VaR, EPE, ETE, Dodd-Frank and 0-7 day credit risk exposure.
- Interacted closely with Business Analysts, Risk Analytics, Credit Risk Officers, Traders, CAW end users, DBA teams, the CATE team and business stakeholders on roadmap planning and strategic approaches with best practices.
- Re-engineered the CAW application across all processes, particularly Ab Initio graphs, pre/post processes and wrapper scripts.
- Achieved the re-engineering of CAW for all processes, covering architecture issues, data modeling, design of the logical and physical data models, separation of all sources into a set of correlated star/snowflake schema model streams, Ab Initio graphs, pre/post processes and job wrapper scripts.
- Prepared the FRD (Functional Requirement Document), High Level Design, and Low-Level Design (LLD) documents.
- Integrated the implementation of 0-7 Day Portfolio, CCAR RWA (Risk Weighted Asset), Dodd-Frank Stress Testing and Basel Regulatory Reports with ACE, CE, and OPTIMA Teams and closely worked with CCAR Team on Risk Analytic and Liquidity Management.
- Developed Ab Initio/generic graphs with these features for each purpose: the PDL approach, Data Profiler (DQE), MFS, Continuous Flow (CAW What-If processing using the JMS service from JBoss Messaging), component folding, dynamic (ICFF) lookups, XML processing and UNIX shell scripting using AWK and SED.
- Implemented the Parallelism approach and EME standards for the existing Ab Initio graphs that avoids the high utilization of resources and performance issues with best practices.
- Developed the SQL, PL/SQL Procedures and functions on Sybase for CAW GUI application data access.
- Unit tested the Ab Initio graphs and UNIX shell scripts, and participated in User Acceptance Testing of the deliverables.
- Resolved performance tuning issues and problems affecting the 24x7 batch and continuous runs supporting the trading markets of various regions (New York, London, Tokyo and LATAM).
- Provided end-user application support for all production issues, such as OTC derivatives, PFE exposure, What-If analysis, ETF, 0-7 day portfolio and CEF reports; performed root-cause analysis, resolved transactional data processing and configuration issues, and worked with the vendor on application patch installation/deployments.
- Supported the Job Automation schedules with on-call support for all trade markets (continuous & batch) on daily basis.
- Accomplished 100% deliverables on time and provided the Credit Risk based Analytical solutions on User support and data requests on any of Trade Portfolio Pricing in What-If for OTC derivatives, ETF and other Trading products.
- Was primarily involved in Disaster Recovery (DR) failover testing and network-problem scenarios.
- Prepared various documents for process design, job procedures and the end-to-end process.
Environment: Ab Initio GDE 3.0, Ab Initio Co>Operating System 3.0, Ab Initio EME 3.0, DQE (Data Profiler) 3.0, Elementum 3.1, Conduct>It 3.0, PDL, Autosys JIL Programming/Scheduling, SQL, PL/SQL, DbVisualizer 7.2, SQL Developer 3.2, Data Modeler 3.2, TOAD, UNIX Shell Scripting, CA Erwin Data Modeler 9.0, SAP PowerDesigner, Perl, AWK, SED, Python, J2EE, JSP, JMS Messages 2.0, JBoss Messaging Services, JavaScript, JRun Server, Eclipse, XML, SVN, CVS, Sun OS 5.10, Linux, Crontab, SYBASE ASE 15.x, ORACLE 11g and Windows 2000/Windows 8.
Lead ETL Developer
Confidential, Irving, TX
Responsibilities:
- Developed a strong understanding of the source application systems and back ends, and analyzed the requirements and business rules for development/enhancement.
- Designed the Metreo Vision Application Vector Cubes for functional requirements and business rules.
- Analyzed and contributed to the high-level design and prepared the low-level design (LLD).
- Developed complex mappings using transformations such as the Source qualifier, Aggregator, Expression, Lookups, Filter, Router, Sequence Generator, Update Strategy, and Joiner.
- Optimized performance by tuning the Informatica ETL code as well as the SQL.
- Developed Ab Initio graphs, Metreo Vision vector cubes, QlikView reports, dashboard bulletins and MIS reports; developed UNIX shell scripts using AWK and SED, with batch job processing and scheduling via crontab.
- Developed the SQL, PL/SQL Trigger, Procedures and functions on ORACLE and Teradata.
- Developed the Ab Initio Graphs for loading/unloading using FastExport, FastLoad, MultiLoad, Tpump utilities and BTEQ.
- Provided the end-user support for all the production issues, resolved the transactional data processing issues, configuration issues and worked with the vendor for application patch installation/deployments.
- Prepared the various documents for Process Design, Job Procedures and End-to-End Process document.
- Involved in deployment, patch installation for Metreo application vision in all the environments, and maintained the versioning control for all the developments and deployments.
- Tuned performance and worked production issues with end users on transactional data processes.
Environment: Informatica Power Center 7.2, Ab Initio GDE 2.14, Ab Initio Co>Operating System 2.14, Metreo Vision 3.2, QlikView 9.x, JAVA, JBOSS 4.0, Apache Server, XML, SQL, PL/SQL, Oracle 10g, Teradata 7.2, Toad 9.0, UNIX Shell Scripting, AWK programming, UNIX Cron Job Scheduling, SunOS 5.10 and Windows 2000 Professional.
Lead ETL Developer/Analyst
Confidential, St. Louis, MO
Responsibilities:
- Solid understanding of the source legacy systems and Backend (Oracle, IBM DB2 and Teradata).
- Analyzed the requirements and business rules for development/enhancement.
- Designed the architecture of system for functional requirements and business rules.
- Analyzed and contributed to the high-level design (HLD) and low-level design (LLD).
- Analyzed the Database and Data modeling (Star Schema, Snowflake) as per requirements and business rules.
- Developed/enhanced the Ab Initio graphs (reusable and continuous flow) and Shell Wrapper Scripts (graph execution and Job schedule) and developed the generic/custom graphs for reusable functionalities.
- Developed/enhanced the PL/SQL stored procedures, triggers in ORACLE and Teradata.
- Tuned the performance on execution of graphs, Data Unload and High Volume Data Load (Teradata) and reviewed the graph using Ab Initio’s Review Checklist and EME standards.
- Scheduled the Ab Initio jobs using UNIX Crontab, Autosys and Control-M.
- Prepared the various documents for Job Scheduling Process Design, Job Procedures and End-to-End Process document for the Graphs, Wrapper script and Production Deployment document.
- Involved in deployment architectures, Trouble-shooting, Root-cause Analysis for each prod release and code version.
- Participated in the Production IT Problem Management Process using Standard Processes and Tools.
- Participated in Production Support Problem analysis - provide specific, relevant production information to assist in troubleshooting the problem as Production on call support.
Environment: Ab Initio GDE 1.13, Ab Initio Co>Operating System 2.13, Erwin Data Modeler 7.x, COBOL, IMS, JCL, PL/SQL, Oracle 10g, IBM DB2 UDB, Teradata 5/2, Toad 7.6, UNIX Shell Scripting, Perl Scripting, Control-M/Autosys, ScriptIT, Express, HP-UX B.11.11, SunOS 5.9, IBM-AIX.2000.
Senior Ab Initio Developer
Confidential
Responsibilities:
- Analyzed BT's various requirements, business rules and source systems according to the implementations.
- Worked in all the types of source feeds: Text files, Databases, Excel, CSV, XML, XSD, Mainframe, JCL, COBOL, DB2, and Microsoft SQL Server.
- Involved in the High Level Design (HLD) and Low Level Design (LLD) as well as Data Level Mapping between source systems and target systems.
- Developed the Ab Initio graphs using various components such as Rollup, Reformat, XML Parse, XML Reformat, Aggregate, Normalize and Denormalize; also developed the UNIX wrapper scripts for the end-to-end runs of pre- and post-processing.
- Developed the Wrapper script and schedule on weekly basis using Autosys and UNIX crontab.
- Performed unit testing, QA and UAT testing, was involved in the deployment architecture for each release, and managed change management.
- Tuned the performance of the graphs, data unloads and high-volume data loads, and reviewed the graphs using Ab Initio's review checklist and BT's EME standards.
- Prepared the Process Design, Job Procedures and End-to-End Process document for the Graphs and Wrapper script.
- Participated in Production Support Problem analysis - provide specific, relevant production information to assist in troubleshooting the problem as Production on call support.
- Supported the data analysis for production failures and impact analysis, helping to enhance and improve performance.
Environment: Informatica Power Center 7.x.x, Ab Initio GDE 1.x, Ab Initio Co>Operating system 2.x, EME, Actuate, PL/SQL, Oracle 10g, Oracle Warehouse Builder, XML, XSD, XSD Style Sheet, COBOL, JCL, PL/SQL, Oracle 9i, ODI 9i, Toad 7.6, Unix Shell Scripting, Perl Scripting, Control-M, Autosys, SQL Server 2005, HP-UX, SunOS 5.8, Windows 2000.