
Ab Initio / Informatica / BO Developer Resume


VA

SUMMARY

  • 12 years of IT experience across all phases of the software development life cycle.
  • Over 3 years of experience with Big Data Analytics and Platform setup and migration.
  • Over 8 years of experience with Ab Initio, Informatica, Oracle, and other data integration solutions across platforms.
  • Ability to work in multi-vendor implementation teams, managing IT teams and enabling successful solution delivery on time and under budget.
  • Exceptional analytical and problem-solving skills, with flexibility to learn new technologies that advance company goals.
  • Excellent communication and interpersonal skills, positive attitude and perseverance to undertake challenging jobs.
  • Experience in defining and documenting impacted business processes, conceptual data models, and class diagrams, and in facilitating and conducting design, domain, and enterprise review sessions.
  • Experience with end-to-end solution architecture for reporting and analytical data warehouses.
  • Experience in building logical and physical data models and data decomposition diagrams based on data requirements, database administration, and performance tuning of existing processes.
  • Experience building solution architecture using Visio, UML, and whiteboarding, based on gathered business and system requirements.
  • Experience with leading and managing IT teams and IT projects for successful implementation.
  • Experience with audit compliance and change management control processes.

TECHNICAL SKILLS

ETL Tools: Ab Initio 3.0/2, Informatica 9/8/7, Talend, webMethods 7.1, SAS

ERP: SAP NetWeaver

Reporting Tools: Business Objects XI, Hyperion, Microsoft Excel

Big Data: Hadoop 2, Cloudera Manager, Hortonworks, Amazon AWS, Pig Latin, Hive, Flume, MongoDB, Sqoop, Solr, Mahout, ZooKeeper

OS: Unix/Linux, Windows Server 2000, XP, 2007, Mac OS, Ubuntu, VMware 6.1

QA/Testing: IBM Rational RequisitePro, HP Quality Center (ALM), TestDirector

Data Modeling: Erwin, Visio, Toad, SQL Developer

Automation: Control M, Autosys, Ab Initio Plan, HP Service Manager, Lotus Notes

Programming: IBM Sterling, HBase, HDFS, NoSQL, HIVE, Core Java JDK 7, HTML, XML, C++, KornShell, Perl, PL/SQL, SAS, ABAP/4

Project Management: MS Project, Clarity, SharePoint, VersionOne

Web: Magento ecommerce, Volution ecommerce, Authorize.net integration, UPS/USPS/FedEx label print integration, Online POS, Shopping Cart Set-up, CRM, Inventory Management

Access and Identity Management: Oracle IDM, SRM, OIM, OID

Database: SQL Server 2003/2005, Oracle 9/10/11g, Teradata, HBase, MS Access, MVS

Master Data Management: Informatica IDQ, Ab Initio Data Profiler, SAP BW

Methodologies: IBM RUP, UML, Agile, Kanban, Waterfall, SDLC

PROFESSIONAL EXPERIENCE

Confidential, Richmond, VA

Big Data Consultant

Responsibilities:

  • Analyzed large data sets by running Hive queries and Pig scripts.
  • Loaded and read files using Hive and created new database objects, storing identity data for more than 65,000 employees covering over 3,000 application accesses.
  • Worked with the Data Science team to gather requirements for various data mining projects
  • Involved in creating Hive tables, and loading and analyzing data using hive queries.
  • Developed simple to complex MapReduce jobs using Hive and Pig.
  • Involved in running Hadoop jobs to process millions of records of text data.
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
  • Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Involved in loading data from the Linux file system into HDFS.
  • Responsible for managing data from multiple sources
  • Extracted data from Oracle and Linux sources through Sqoop, landed it in HDFS, and processed it (see the Sqoop sketch after this list).
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data.
  • Assisted in exporting analyzed data to relational databases using Sqoop
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
  • Registered Amazon AWS services, installed multi-node Ubuntu machines with additional machines per user, and configured the site XML files.
  • Installed the HBase/Hive data warehouse, Pig for Pig Latin Grunt scripting, and MongoDB for document storage.
  • Developed custom UDFs/UDAFs for use in HiveQL scripts.
  • Migrated data by transferring files to the Ubuntu server via IBM Sterling Connect:Direct for encryption and firewall validation.
  • Used the MS Excel Power Query add-in to build business intelligence reports from data in HDFS.
  • Used Cloudera Manager to monitor TaskTracker, JobTracker, and overall cluster health.
  • Used HiveQL statements such as SELECT, CREATE, LOAD DATA, INSERT, LIMIT, and JOIN for data loading and manipulation.
  • Partitioned datasets by designing partitioned tables for horizontal partitioning and bucketing data for sample analysis (see the Hive sketch after this list).
  • Built a metadata-driven solution to add dynamically created Teradata and Oracle roles in CTSA to the OLAP IAM system, using an Ab Initio job and loading the data into C-Cloud.
  • Worked closely with Fidelity (GNTS/BANK-RACF) groups for Access and Identity data transactions and audit.
  • Set up SSH key exchange between servers to enable SFTP (see the key-exchange sketch after this list).
  • Documented and facilitated requirements-gathering sessions, JAD sessions, test case reviews, and HLD/TOSSG documents for Level 2 support groups.
  • Worked on a POC for continuous flows, building queues for real-time data file loads that saved time and avoided scheduling issues.
  • Led the implementation of a reporting data warehouse for analytical users on Oracle 10g/11g.
  • Designed flat normalized tables for quick caching into the Forms Java server, automated database backup/RMAN scripts on Control-M, and integrated them with ETL load batch jobs.
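
The Sqoop extraction noted above followed the standard import pattern; a minimal sketch is below. The connection string, schema, table, and directory names are placeholders, not the actual project values.

```sh
#!/bin/ksh
# Hypothetical Sqoop import of an Oracle table into HDFS; credentials,
# table, and target directory are illustrative placeholders.
sqoop import \
  --connect "jdbc:oracle:thin:@//dbhost.example.com:1521/ORCL" \
  --username etl_user \
  --password-file /user/etl_user/.oracle_pwd \
  --table HR.EMPLOYEE_ACCESS \
  --target-dir /data/raw/employee_access \
  --fields-terminated-by '|' \
  --num-mappers 4

# Confirm the extract landed before any downstream Hive processing.
hadoop fs -ls /data/raw/employee_access
```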
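
The Hive partitioning and bucketing called out above follow the usual HiveQL pattern; the sketch below is a minimal illustration with invented table and column names, not the actual identity-data schema.

```sh
#!/bin/ksh
# Staging table over the raw extract, then a partitioned, bucketed target
# supporting horizontal partitioning and bucket sampling (names are hypothetical).
hive -e "
CREATE EXTERNAL TABLE IF NOT EXISTS app_access_stg (
  emp_id STRING, app_name STRING, granted_on STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '/data/raw/employee_access';

CREATE TABLE IF NOT EXISTS app_access (
  emp_id STRING, app_name STRING, granted_on STRING)
PARTITIONED BY (load_date STRING)
CLUSTERED BY (emp_id) INTO 32 BUCKETS;

SET hive.enforce.bucketing=true;
INSERT OVERWRITE TABLE app_access PARTITION (load_date='2014-06-01')
SELECT emp_id, app_name, granted_on FROM app_access_stg;

-- Sample a single bucket for quick analysis.
SELECT app_name, COUNT(*) AS grants
FROM app_access TABLESAMPLE(BUCKET 1 OUT OF 32 ON emp_id)
GROUP BY app_name
LIMIT 10;
"
```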
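
The SSH key exchange mentioned above is the standard public-key setup; the host and account names below are placeholders.

```sh
#!/bin/ksh
# Generate a key pair for the ETL account (no passphrase, for unattended SFTP).
ssh-keygen -t rsa -b 2048 -f ~/.ssh/id_rsa -N ""

# Append the public key to authorized_keys on the target server.
ssh-copy-id etl_user@target-server.example.com

# Verify that password-less SFTP now works end to end.
echo "put /data/outbound/extract.dat /inbound/" | sftp etl_user@target-server.example.com
```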

Confidential, Miami, FL

ETL Architect

Responsibilities:

  • As part of the BI Quality Assurance team, was responsible for developing test cases and test scripts.
  • Created high-volume test data using Ab Initio graphs with the Generate Records component, and recorded test cases in Quality Center with validation screenshots as attachments.
  • Performed GAP analyses and rollout/implementation planning for the overall project.
  • Performed urgent fixes on Informatica mappings, Autosys scheduling list files, and webMethods services; maintained Quality Center test plans, test scripts, and defect tracking for vendor data (Micros) and internal development teams.
  • Utilized the Source Qualifier transformation for joins within the same database, filters, sorted ports, distinct values, and custom query overrides (an illustrative override query follows this list).
  • Used the Sequence Generator transformation to replace primary keys and to support slowly changing dimensions.
  • Used the Joiner transformation for normal, master outer, detail outer, and full outer joins.
  • Used the Normalizer transformation with COBOL and relational sources to break repeated data out of a single row.
  • Built mappings with the Router transformation to test input data against multiple conditions and improve performance.
  • Used the Rank transformation to return the strings at the top or the bottom of a session sort order.
  • Used the Update Strategy transformation to flag rows for insert, delete, update, or reject.
  • Used the Transaction Control transformation to define database-level commits within a mapping.
  • The project migrated the HR and Payroll application from the existing infrastructure to a new platform comprising Informatica and Linux servers.
  • The Global Information Management Program comprised MDM implementation, an IT/infrastructure audit, and control-and-balance analysis and implementation.
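
The Source Qualifier work above (same-database joins, filters, sorted ports, distinct rows, custom queries) boils down to an override query of roughly the following shape. The schema and credentials are invented for illustration; in the actual mappings the SQL is generated or overridden inside Informatica rather than run from a script, so the sketch below is only a way to validate the equivalent result set from the command line.

```sh
#!/bin/ksh
# Illustrative SQL of the kind a Source Qualifier override would issue
# (hypothetical HR schema, password via an assumed ORA_PWD variable);
# useful for checking row counts outside the Informatica mapping.
sqlplus -s etl_user/"$ORA_PWD"@HRDB <<'EOF'
SELECT DISTINCT e.emp_id,
       e.emp_name,
       p.pay_grade
FROM   hr.employees e
JOIN   hr.payroll   p ON p.emp_id = e.emp_id   -- join within the same database
WHERE  e.status = 'ACTIVE'                     -- source filter
ORDER  BY e.emp_id;                            -- sorted ports
EXIT;
EOF
```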

Confidential, Mclean, VA

ETL Designer/Developer

Responsibilities:

  • Automated the entire data mart process using UNIX shell scripts and scheduled it with Control-M after dependency analysis (a wrapper-script sketch follows this list).
  • Worked in a Scrum/Agile environment and made extensive use of Ab Initio's component, data, and pipeline parallelism.
  • Developed graphs in the GDE using components such as Partition by Round Robin, Partition by Key, Rollup, Sort, Scan, Dedup Sorted, Reformat, Join, Merge, and Gather.
  • Worked with departition components such as Gather and Interleave to departition and repartition data from multifiles as needed.
  • Actively involved in project scope discussions, setting and meeting timelines and deadlines, developing remediation plans, recognizing risk factors, ensuring quality deliverables, tracking change through controls, and establishing communication through daily work packets. Managed the Command Center UK EDW (enterprise data warehouse) production rollout.
  • Communicated with the business to understand the project and interacted with BSAs/BAs to generate test scripts and test plans.
  • Developed the application design; individually handled Quality System requests, attended to urgent/Sev 4 production incidents, and completed implementations in production.
  • Supported the Release Manager for the consolidated rollout of Data Services as a whole; created knowledge transition plans and documents for the legacy application (EXODUS).
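
The data mart automation in the first bullet was built as UNIX shell wrappers called from Control-M; the sketch below shows the general structure under assumed paths and job names, not the actual project scripts.

```sh
#!/bin/ksh
# Hypothetical Control-M wrapper for one data mart load step: source the
# Ab Initio environment, run the deployed graph, and signal Control-M
# through the exit code (paths and names are placeholders).
BASE_DIR=/apps/datamart
RUN_DATE=$(date +%Y%m%d)
LOG_FILE=$BASE_DIR/logs/load_sales_mart_$RUN_DATE.log

# Environment for the Ab Initio Co>Operating System and database logins.
. $BASE_DIR/config/etl.env

print "Starting sales mart load for $RUN_DATE" | tee -a "$LOG_FILE"

# Deployed Ab Initio graphs run as generated ksh scripts.
"$BASE_DIR/run/load_sales_mart.ksh" "$RUN_DATE" >> "$LOG_FILE" 2>&1
rc=$?

if [ $rc -ne 0 ]; then
    print "Load failed with status $rc" | tee -a "$LOG_FILE"
    exit $rc      # non-zero exit lets Control-M mark the job NOTOK
fi

print "Load completed successfully" | tee -a "$LOG_FILE"
exit 0
```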

Confidential, VA

Ab Initio / Informatica / BO Developer

Responsibilities:

  • Developed best practices for Informatica developers and administrators covering folder migration and access.
  • Extensively used the Ab Initio multifile system, where data is partitioned into eight partitions for parallel processing.
  • Made wide use of lookup files when pulling data from multiple sources where data volumes are limited.
  • Developed generic Ab Initio graphs for data cleansing, data validation, and data transformation.
  • Worked with Departition Components like Concatenate, Gather, Interleave and Merge in order to de-partition and repartition data from Multi files accordingly.
  • Worked with partition components such as Partition by Key, Partition by Expression, and Partition by Round Robin to partition data from serial files.
  • Used Toad to verify the counts and results of the graphs.
  • Worked with Teradata utilities such as MultiLoad, FastLoad, TPump, and FastExport to load or extract tables (a FastLoad sketch follows this list).
  • Implemented business validation rules in the staging area using Ab Initio built-in functions such as is_valid, is_null, is_defined, is_error, and is_blank.
  • Extensively used database components such as Run SQL, Update Table, Multi Update Table, and Join with DB.
  • Implemented SCD Type 2 for history maintenance (a plain SQL sketch of the logic follows this list).
  • Analyzed business needs (functional and technical specification document) based upon user requirements with extensive interactions with business users.
  • Performed delta extraction from the OLTP systems into the DWH tables; involved in creating the High-Level Design (HLD) and Low-Level Design (LLD) documents.
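
The Teradata utility work above can be illustrated with a FastLoad script of roughly the following shape; the TDPID, credentials, table, and file layout are placeholders, and the real jobs used MultiLoad, TPump, or FastExport as the use case required.

```sh
#!/bin/ksh
# Approximate shape of a FastLoad job loading a delimited file into an
# empty Teradata staging table (all identifiers and paths are hypothetical).
fastload <<'EOF'
SESSIONS 4;
ERRLIMIT 25;
LOGON tdprod/etl_user,etl_password;

DROP TABLE stg.customer_err1;
DROP TABLE stg.customer_err2;

SET RECORD VARTEXT "|";
DEFINE
  cust_id   (VARCHAR(10)),
  cust_name (VARCHAR(50)),
  cust_addr (VARCHAR(100))
FILE = /data/inbound/customer.dat;

BEGIN LOADING stg.customer
  ERRORFILES stg.customer_err1, stg.customer_err2
  CHECKPOINT 100000;

INSERT INTO stg.customer (cust_id, cust_name, cust_addr)
VALUES (:cust_id, :cust_name, :cust_addr);

END LOADING;
LOGOFF;
EOF
```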
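
The SCD Type 2 history maintenance above was implemented in the graphs, but the underlying logic can be sketched in plain SQL: close out the current version of any changed row, then insert a new current version for changed and brand-new keys. The staging and dimension tables below are hypothetical.

```sh
#!/bin/ksh
# Plain SQL sketch of SCD Type 2 logic run through BTEQ (illustrative tables
# and columns; the production implementation was an Ab Initio graph).
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;

/* Step 1: expire the current version of rows whose attributes changed. */
UPDATE dw.customer_dim
SET eff_end_date = CURRENT_DATE - 1,
    current_flag = 'N'
WHERE current_flag = 'Y'
  AND cust_id IN (
      SELECT s.cust_id
      FROM stg.customer s
      JOIN dw.customer_dim d
        ON d.cust_id = s.cust_id
       AND d.current_flag = 'Y'
      WHERE s.cust_name <> d.cust_name
         OR s.cust_addr <> d.cust_addr);

/* Step 2: insert a new current version for changed and new customers. */
INSERT INTO dw.customer_dim
  (cust_id, cust_name, cust_addr, eff_start_date, eff_end_date, current_flag)
SELECT s.cust_id, s.cust_name, s.cust_addr,
       CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg.customer s
LEFT JOIN dw.customer_dim d
  ON d.cust_id = s.cust_id
 AND d.current_flag = 'Y'
WHERE d.cust_id IS NULL;

.LOGOFF;
.QUIT;
EOF
```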
