Application Development Consultant Resume
Eden Prairie, MN
SUMMARY
- Hands-on experience developing ETL processes and Big Data tools for extracting data from various sources, transforming it, and loading it into Confidential systems using DataStage, Talend, and Hadoop ecosystem tools across the Manufacturing, Health Care, Retail, and Pharmaceutical industries.
- Experience configuring and testing Hadoop ecosystem components (MapR) and building business intelligence infrastructure to support Optum One business needs.
- Capable of processing, cleansing, and applying business rules to large sets of structured UHG entity data in Hadoop and Talend ETL.
- Executed several POCs for Big Data ingestion in the Optum EODS environment.
- Expert in using QualityStage for data cleansing and data standardization using stages such as Match Frequency, MNS, WAVES rules, and Investigate.
- Extensive data warehouse expertise using Teradata utilities on the mainframe (BTEQ, FEXP, FLOAD, MLOAD and TMD).
- Expert in migrating and creating sub-environments for new applications in the Global Integrating Factory environments.
- Experience in performance tuning of DataStage jobs; capable of identifying and resolving performance bottlenecks at various levels.
- Involved in all aspects of ETL: requirements gathering with standard interfaces to be used by operational sources, data cleansing and data loading strategies, designing ETL mappings, documentation, and testing the performance of DataStage jobs.
- Expert in designing server jobs using various stages such as Sequential File, ODBC, Hashed File, Aggregator, Transformer, Sort, Link Partitioner, and Link Collector.
- Proficient in data warehousing techniques for data mining (CRCA), data cleansing, and Slowly Changing Dimensions.
- Experience in UNIX shell scripting for file manipulation, job scheduling, and text processing on the server.
- Proficient in installation and configuration of DataStage on the server.
- Strong grasp of OLAP and OLTP architecture from recent projects on financial transactional data marts at Mercedes-Benz Financial Services.
- Excellent knowledge of studying data dependencies using metadata stored in the DataStage Repository.
- Effective in cross-functional and global environments, managing multiple projects and assignments concurrently, with strong client interaction, full-SDLC, communication, presentation, and problem-solving skills.
TECHNICAL SKILLS
- ETL Tools: IBM InfoSphere Information Server 8.1/8.5 (DataStage, QualityStage, Information Analyzer, Metadata Workbench), IBM DataStage Enterprise and DataStage 7.5/7/6.x/5.x/4 (Designer, Manager, Director and Administrator)
- Databases: Oracle 8i/9i/10g, DB2/UDB 7.x/8.x
- Data Modeling: ERwin 7.x/4.x/3.5
- Tools and Utilities: SQL*Plus, SQL*Loader, Toad 8.0/9.0, Autosys
- Languages: PL/SQL, C/C++, UNIX shell scripting
- Operating Systems: Sun Solaris, UNIX, IBM AIX, MS Windows 95/98/NT/2000/2003/XP
- Big Data: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, Oozie, Flume and Talend
PROFESSIONAL EXPERIENCE
Confidential, Eden Prairie, MN
Application Development Consultant
Responsibilities:
- Created several complex and efficient ETL jobs, in combination with UNIX scripts, to perform the quality and business checks required by the Optum One business.
- Integrated Hadoop into traditional ETL to accelerate the extraction, transformation, and loading of massive volumes of structured and unstructured data.
- Performed a POC to design and develop a pilot Big Data analytics platform for processing health care data-viewing preferences using Hadoop, Hive, and Pig for an Optum One customer.
- Analyzed data by running Hive queries and Pig scripts to understand the behavior of newly ingested data.
- Implemented several successful POCs for ETL web services related to health care IMS clients, using SOAP-based web services.
- Worked extensively with Sqoop to import and export data between Oracle and HDFS/Hive (a representative sketch follows this section).
- Planned the automation schedule for a network of complex ETL jobs that run on a Tivoli workstation.
- Accountable for meeting ETL development timelines and documenting the operations manual handed over to the production support team.
- Validated source-to-Confidential documentation with the data analysis team to freeze column-level mapping changes and start the development phase.
- Created and automated ETL jobs for the outbound files sent to HEDIS and Optum One customers.
- Resolved defects raised by the QA team and assigned them to the offshore team.
- Ran and monitored ETL code and reduced lead time for ETL code throughput.
- Implemented efficient UNIX environment changes, such as increasing logical nodes, and proposed cost-effective solutions for shared-network versus pay-per-usage environments.
- Mentored and guided junior developers onsite and offshore.
- Resolved issues and tracked progress in the ETL automation.
- Worked in parallel on future releases for the various inbound entities acquired by Confidential.
- Proposed timelines for development activities and resource estimates for each release.
Environment: DataStage 7.5/8.1/8.5/8.7/11.3, SQL Server 2008, DB2, Oracle, XML, web services, UNIX scripting, Control-M, Hadoop ecosystem: MapReduce, HDFS, Hive, Pig, Sqoop, Oozie, Flume.
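A minimal sketch of the Sqoop ingestion and Hive validation pattern referenced above; the connection string, table, and schema names are placeholders, not actual Optum values:

```sh
#!/bin/sh
# Hypothetical example: pull an Oracle table into Hive via Sqoop, then
# sanity-check the row count of the newly ingested data. All names are placeholders.

# Import from Oracle into a Hive-managed table.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table MEMBER_CLAIMS \
  --hive-import --hive-table stage.member_claims \
  --num-mappers 4

# Validate the ingest with a simple Hive query.
hive -e "SELECT COUNT(*) FROM stage.member_claims;"
```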
Confidential, Minneapolis, MN
ETL Developer / Domain Engineer
Responsibilities:
- Created and modified ETL code, adding columns with the required data types to DataStage jobs per change requests from the BI team.
- Built web service DataStage jobs against one of the Confidential web sites to generate store-location information based on zip code, producing a sequential file that is consumed daily by the BI team.
- Created high-level design documents for all the new ETLs and their interactions with the downstream applications related to Keystone R2.
- Performed initial/preliminary analysis during the Discovery phase to estimate the number of DataStage jobs for the data integration process replacing a legacy project management tool (Prolog).
- Interacted with the IBM and CFI teams on their SaaS application hosting solutions and conducted gap analysis for a new application (IBM TRIRIGA) implementation replacing the legacy system (Prolog).
- Created the ODS usage analysis and coordinated the data mapping exercise for the applications in the current environment that depend on the legacy system (Prolog).
- Analyzed SQL replication jobs in the legacy system, documenting their daily schedules and column-level data mappings from source to ODS.
- Identified critical datasets for the data migration plan as part of the initial load into the CFI online hosted solution.
- Listed all critical views/tables accessed by each report and planned report remediation for when the new application goes live.
- Loaded test data in the DB2 environment from Prod to the Stage, Test, and Dev environments so the testing team could execute application testing at various levels.
- Imported and exported data in the DB2 database for the IEX project in line with the MFT (mass file transfer) protocols, and fixed defects in the files as part of file validation (see the sketch after this section).
- Thorough knowledge of the application as part of the Domain Engineering team, presenting it to the review board for the final go-live recommendation.
- Used Teradata utilities to load data from Production to the Dev and Test environments as part of the in-sync process with production data.
- Hosted and conducted weekly review meetings with the offshore team for in-sync sessions, remediating issues and concerns related to BI, ETL, and MFT data.
Environment: DataStage 7.5/8.1/8.5/8.7, SQL Server 2005, DB2, Oracle, MicroStrategy, WebFOCUS, UNIX shell scripting, Control-M, ERwin.
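A minimal sketch of the DB2 export/import pattern used for environment loads like those above; the database, schema, and file names are placeholders:

```sh
#!/bin/sh
# Hypothetical example: copy a table's contents between DB2 environments
# using delimited export/import. All names are placeholders.

# Export the table from the source database to a delimited file.
db2 connect to PRODDB user etl_user using "$DB2_PWD"
db2 "EXPORT TO /tmp/store_items.del OF DEL SELECT * FROM appschema.store_items"
db2 connect reset

# Import the file into the target database.
db2 connect to TESTDB user etl_user using "$DB2_PWD"
db2 "IMPORT FROM /tmp/store_items.del OF DEL INSERT INTO appschema.store_items"
db2 connect reset
```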
Confidential, Warren, MI
ETL Datastage Developer
Responsibilities:
- Designed DataStage interfaces for the ACI project and migrated the application to the Confidential GIF environment; deployed code in ITL, Pre-Prod, and Prod, and executed SIT cases in ITL and Pre-Prod.
- Created a new environment for the ACI application similar to the former one, changing the .profile, the configuration file, the DataStage project-level parameters (.param files), and the Administrator client configuration (user-defined variables).
- Executed the catch-up loads and monitored performance in OEM and DataStage Director; monitored the CTL tables (RUN, AUTOMATION and TABLE BALANCE) for any rejected records.
- Coordinated with the source file provider (Axiom) on any discrepancies in the files and the root cause of rejected records.
- Edited certain files in case of job aborts due to bad records; after placing the files in the ftpsources directory, reran the batch sequence manually and monitored it until all the batch sequences completed.
- Edited the crontab entries per the file arrival schedule from the source system (see the sketch after this section).
- Edited mail configuration files to add new members to be notified by the batch ETL completion report sent automatically from the server.
- Configured SMART, which places the files on the application server in a secure encrypted format.
- Created and executed test plans for the ITL and Pre-Prod environments.
- Prepared the application deployment guide, disaster recovery plan, and mapping documents in coordination with the offshore team.
- Coordinated with the offshore DBA team to check connectivity with SMART and the database.
- Created RFCs and coordinated the CAB approvals from GM.
Environment: DataStage 8.1/8.5/8.7, SeeBeyond v5/5.0.5, SAP PI, Oracle 11gR2, UNIX shell scripting, Autosys, iCollab, ccm.net
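A minimal sketch of the crontab-driven batch kickoff described above, using the standard DataStage dsjob CLI; the project, sequence, paths, and schedule are invented for illustration:

```sh
#!/bin/sh
# run_batch.sh -- launch a DataStage batch sequence via the dsjob CLI.
# Scheduled from cron per the file-arrival window, e.g. (placeholder entry):
#   30 2 * * * /opt/etl/scripts/run_batch.sh >> /opt/etl/logs/run_batch.log 2>&1

# Source the DataStage engine environment (default install path shown).
. /opt/IBM/InformationServer/Server/DSEngine/dsenv

# -jobstatus makes dsjob wait and return the job's finishing status.
dsjob -run -jobstatus ACI_PROJECT Batch_Seq_Nightly
rc=$?

# Status 1 = finished OK, 2 = finished with warnings; anything else is a failure.
if [ "$rc" -ne 1 ] && [ "$rc" -ne 2 ]; then
  echo "Batch sequence failed with status $rc" |
    mailx -s "ETL batch failure" etl-support@example.com
fi
```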
Confidential, Farmington Hills, MI
ETL DataStage Developer
Responsibilities:
- UNIX administrator for changing passwords for InfoLease users.
- Production support and re-validation of nightly DataStage batch jobs.
- Changed DataStage code for new requirements from various teams within Confidential USA and Canada.
- Autosys job scheduling and monitoring.
- Restarted and maintained MQ for real-time data for EOS users in Canada and the USA (a restart sketch follows this section).
- DataStage job debugging and performance tuning of server jobs.
- Manually sent files to different business users as required for on-demand files.
- Extracted and transformed data from the UniData database and loaded it into sequential files that are eventually used by the reporting team.
- Tested various DataStage jobs with ID numbers for validation and notified the InfoLease team of mismatches.
- Sustained all the interfaces in the EDW, resolved issues, and executed SIT for Release 9.
- Resolved tickets for critical ETLs and performed root cause analysis for ETL failovers and aborts.
- Deployed code as part of RFCs for enhancements and coordinated Release 9 into production.
Environment: DataStage 8.0/8.1, InfoLease, Oracle 11gR2, UniData database, UNIX shell scripting, Autosys, EOS.
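A minimal sketch of the kind of MQ queue manager check-and-restart used in the support work above; the queue manager name and listener port are placeholders:

```sh
#!/bin/sh
# Hypothetical check-and-restart for an IBM MQ queue manager; the queue
# manager name and listener port are placeholders.

QMGR=EOS_QM

# dspmq prints each queue manager's state, e.g. "QMNAME(EOS_QM) STATUS(Running)".
if dspmq -m "$QMGR" | grep -q 'STATUS(Running)'; then
  echo "$QMGR is running"
else
  echo "$QMGR is down - restarting"
  strmqm "$QMGR"                        # start the queue manager
  runmqlsr -m "$QMGR" -t tcp -p 1414 &  # restart its TCP listener
fi
```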
Confidential, Tempe, AZ
ETL DataStage Developer
Responsibilities:
- Extensively worked with DataStage processing stages such as Transformer, Hashed File, Sequential File, Aggregator, Sort, Merge, Link Partitioner, Link Collector, IPC, and XML for loading data into the Confidential Point of Sale data mart.
- Extensively used QualityStage stages such as Match Frequency, MNS, Survive, and WAVES for profiling, consolidating, and cleansing data for Confidential's Club Card customer reward point system.
- Used Information Analyzer for data profiling and to check the quality and structure of the sample data.
- Developed processes to extract the source data and load it into the data warehouse after cleansing, transformation, and integration.
- Designed various batch jobs and sequence jobs to load data into the Point of Sale data mart and the customer demographics data mart; designed several source-to-Confidential mappings and tuned them for better performance.
- Worked on programs for scheduling data loading and transformations with DataStage from source systems into Oracle 10g using PL/SQL (a loader sketch follows this section).
- Worked on cleansing the sales data of Confidential stores, which comes from different regions across 43 stores in Arizona, using QualityStage.
- Designed the ODS for quick operations on individual data and performed performance tuning of DataStage jobs to reduce batch run time.
- Designed the DataStage jobs to extract data from text files, transform it according to the business requirements, and load it into Teradata tables.
- Worked with business analysts to prepare ETL mapping documents.
Environment: DataStage 8.0, Oracle 10g, SQL*Loader, PL/SQL, ERwin 4.0, XML, XSD, Red Hat Linux, Teradata, Autosys, UNIX and Windows NT
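A minimal sketch of a SQL*Loader flat-file load of the sort used alongside these jobs; the control file, table, and column names are illustrative only:

```sh
#!/bin/sh
# Hypothetical SQL*Loader load of a delimited POS extract into Oracle 10g;
# table, column, and file names are placeholders.

cat > pos_sales.ctl <<'EOF'
LOAD DATA
INFILE '/data/pos_sales.csv'
APPEND INTO TABLE pos_sales
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(store_id,
 sale_dt DATE 'YYYY-MM-DD',
 amount)
EOF

# Bad records land in pos_sales.bad for inspection and reprocessing.
sqlldr userid=etl_user@ORCL control=pos_sales.ctl \
       log=pos_sales.log bad=pos_sales.bad
```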
Confidential, Piscataway, NJ
ETL DataStage Developer
Responsibilities:
- Designed, developed, and loaded data into the data warehouse using various DataStage processing stages such as Aggregator, Filter, Funnel, Join, Sort, Merge, and Lookup, designing source-to-Confidential mappings for loading tables in a medical device data mart.
- Developed data marts by creating new transformations and loading data from various sources such as relational databases, flat files, XML, and WSDL documents into the data warehouses.
- Used DataStage Administrator to create the repository, user groups, and users, and managed users by setting up their privileges and profiles.
- Designed DataStage ETL jobs to extract data from heterogeneous source systems, transform it, and finally load it into the data marts.
- Created error files and log tables containing data with discrepancies to analyze and reprocess the data (a sketch follows this section).
- Tuned DataStage transformations and jobs to enhance their performance.
- Wrote PL/SQL statements, stored procedures, and triggers for extracting as well as writing data.
- Used the DataStage Director and its run-time engine to schedule and run the solution, test and debug its components, and monitor the resulting executable versions on an ad hoc or scheduled basis.
- Created job sequences and job schedules to automate the ETL process.
- Provided technical support to the offshore team on issues such as Confidential mappings and testing.
Environment: Ascential DataStage 7.5 (Designer, Director, Manager, Parallel Extender), MetaStage (Enterprise Metadata Directory), Oracle 8i, PL/SQL, IBM DB2, UNIX sequential files, MS Access, Windows NT and UNIX
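A minimal sketch of the error-table pattern referenced above: a PL/SQL procedure, run through SQL*Plus from the shell, that logs discrepant rows for reprocessing. All table, column, and procedure names are invented:

```sh
#!/bin/sh
# Hypothetical PL/SQL procedure that copies rows failing a business rule
# into an error log table; all names are placeholders.

sqlplus -s etl_user/"$ORA_PWD"@ORCL <<'EOF'
CREATE OR REPLACE PROCEDURE log_bad_devices AS
BEGIN
  -- Move rows with missing serial numbers into the error table for reprocessing.
  INSERT INTO device_errors (device_id, load_dt, reason)
    SELECT device_id, SYSDATE, 'missing serial number'
    FROM   device_stage
    WHERE  serial_no IS NULL;
  COMMIT;
END;
/
EXEC log_bad_devices
EXIT
EOF
```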