
Business Analyst Resume Profile


PROFESSIONAL SUMMARY:

  • Over 4 years of experience in developing and designing data warehouse, data migration, identity resolution, and data quality projects using Informatica PowerCenter 8.x/9.x.
  • 2 years of experience with Oracle databases using SQL, 1 year of experience with MS SQL Server, 1 year with Teradata, and 1 year with MySQL and DB2.
  • Over 1 year of experience with Business Objects.
  • Experienced in Full Systems Development Life-Cycle Implementation of Various ETL Projects.
  • Experienced in Design, Development, Implementation and Testing of Netezza Databases and ETL Processes.
  • Extracted/loaded data from/into diverse source/target systems such as Oracle, SQL Server, Salesforce, and flat files.
  • Back-end database table design and implementation for the data warehouse and related ETL processes.
  • Analysis, Design, Development and Implementation of Data warehouse.
  • Extensive working experience in data migration using Informatica Power Center.
  • Good knowledge in Data Warehouse implementation using tools like Informatica Power Center and Admin Console.
  • Proficient with Informatica Workflow Manager, Workflow Monitor, and Server Manager; used the pmcmd command-line utility to create, schedule, and control workflows, tasks, and sessions, and the pmrep and infacmd utilities to export domain objects, users, and groups.
  • Experienced in Tuning Informatica Mappings to identify and remove processing bottlenecks.
  • Experienced in using automation tools like Autosys.
  • Experienced in building UNIX shell scripts.
  • Post-production support, enhancements, and performance tuning.
  • Excellent analytical and communication skills and leadership qualities; able to work in a team and communicate effectively at all levels of the development process.
  • Experienced in setting up environments from scratch and installing Informatica on UNIX servers.
  • Experienced in migrating from Teradata to Netezza and removing processing bottlenecks in Netezza.
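The pmcmd usage noted above can be sketched as follows. All service, domain, folder, and workflow names here are hypothetical, and the command string is only printed (not executed), since pmcmd requires a live PowerCenter domain:

```shell
# Hypothetical names throughout; shown as a dry run rather than executed.
INFA_SERVICE="IS_DEV"        # hypothetical Integration Service
INFA_DOMAIN="Domain_Dev"     # hypothetical domain
INFA_USER="etl_admin"        # hypothetical repository user
FOLDER="SALES_DW"            # hypothetical repository folder
WORKFLOW="wf_daily_load"     # hypothetical workflow

# With -wait, pmcmd startworkflow blocks until the workflow completes,
# which lets a calling script act on the exit status.
CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -u $INFA_USER -f $FOLDER -wait $WORKFLOW"
echo "$CMD"
```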

TECHNICAL SKILLS:

PROFESSIONAL EXPERIENCE:

Confidential

Business Analyst

Description:

VUDU Data Warehouse Build

Vudu, a digital content delivery and media technology company, was acquired by Walmart in March 2010. The goal of this project is to deliver Vudu retailer sales reporting capability to Category Management. Vudu, Inc. is responsible for Vudu-branded interactive media services and devices; it distributes full-length movies over the Internet to televisions in the United States through a content delivery network that uses hybrid peer-to-peer technology.

Responsibilities:

  • Designed the ETL process, load strategy, and requirements specification after gathering requirements from the application users.
  • Maintained the release-related documents.
  • Involved in data design and modeling, system study, design, and development.
  • Created tables and views as per project requirements.
  • Designed the functional specifications for the source, target, current and proposed processes, and the interface process flow diagram.
  • Created mappings using Informatica Designer to build business rules to load data.
  • Created folders and placed all tested ETL and UNIX scripts in the staging path for Development, QA, and Production movement.
  • Generated complex SQL queries using joins, subqueries, and correlated subqueries.
  • Developed Informatica parameter files to filter the daily data from the source system and implemented dynamic parameterization for all workflow-related variables.
  • Prepared the data flow of all Informatica objects created to support testing at the unit, performance, and integration levels.
  • Studied session log files to find errors in mappings and sessions.
  • Created UNIX scripts to handle the FTP of source files, execute them in sequence by timestamp, and archive processed files for future reference.
  • Created UNIX shell scripts for automation and parameter file generation.
  • Identified and analyzed data discrepancies and data quality issues and worked to ensure data consistency and integrity. Performed audits on ETLs and validated source data vs. target table loads.
  • Worked on bug fixes in existing Informatica mappings to produce correct output.
  • Identified bottlenecks in the source, target, mapping, and loading processes and successfully resolved the performance issues across this project.
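The FTP-sequencing and archival pattern described above can be sketched as follows; the directories and file names are hypothetical, and two demo files stand in for the real source feeds:

```shell
# Hedged sketch of the file-sequencing/archival step; paths are hypothetical.
SRC_DIR="/tmp/etl_demo/incoming"   # hypothetical FTP landing directory
ARC_DIR="/tmp/etl_demo/archive"    # hypothetical archive directory
mkdir -p "$SRC_DIR" "$ARC_DIR"

# Demo files with distinct timestamps (real files would arrive via FTP)
touch -t 202001010800 "$SRC_DIR/sales_20200101.dat"
touch -t 202001020800 "$SRC_DIR/sales_20200102.dat"

# ls -tr lists oldest first, so files are processed in timestamp order
for f in $(ls -tr "$SRC_DIR"); do
    echo "processing $f"               # a real script would launch the load here
    mv "$SRC_DIR/$f" "$ARC_DIR/$f"     # archive the processed file for reference
done
```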

Environment: Informatica PowerCenter 9.1, Netezza, SQL Server, and UNIX.

Confidential

Team Lead.

Description:

Concurrent Process Engineering

CATMAN is an online booking and movie distribution application for 20th Century Fox that tracks movie releases in theatres across Fox territories, based on which invoicing and receivables are managed. OCEANS is Hollywood Software's TDS product. It provides a centralized repository for all theatrical distribution information, creating a natural, integrated flow from planning to booking, to gathering and reporting grosses, to billing and managing receivables. The OCEANS system will replace the distribution functionality of the current theatrical distribution systems of territories such as Mexico, Australia, Spain, and Japan. Owing to the performance effectiveness and functional capabilities of Netezza, the data was migrated from Teradata to Netezza, and the conversion had to be done with Netezza's constraints in mind. Custom-built ETL processes will provide interfacing between OCEANS and other internal and external theatrical sources/systems.

Responsibilities:

  • Designed the ETL process, load strategy, and requirements specification after gathering requirements from the application users.
  • Maintained the release-related documents.
  • Involved in data design and modeling, system study, design, and development.
  • Created tables and views as per project requirements.
  • Designed the functional specifications for the source, target, current and proposed processes, and the interface process flow diagram.
  • Created mappings using Informatica Designer to build business rules to load data.
  • Created folders and placed all tested ETL and UNIX scripts in the staging path for Development, QA, and Production movement.
  • Generated complex SQL queries using joins, subqueries, and correlated subqueries.
  • Developed Informatica parameter files to filter the daily data from the source system and implemented dynamic parameterization for all workflow-related variables.
  • Prepared the data flow of all Informatica objects created to support testing at the unit, performance, and integration levels.
  • Studied session log files to find errors in mappings and sessions.
  • Created UNIX scripts to handle the FTP of source files, execute them in sequence by timestamp, and archive processed files for future reference.
  • Created UNIX shell scripts for automation and parameter file generation.
  • Identified and analyzed data discrepancies and data quality issues and worked to ensure data consistency and integrity. Performed audits on ETLs and validated source data vs. target table loads.
  • Worked on bug fixes in existing Informatica mappings to produce correct output.
  • Identified bottlenecks in the source, target, mapping, and loading processes and successfully resolved the performance issues across this project.
  • Monitored jobs using the Autosys scheduling tool.
  • Supported bug fixes and other ad hoc ticket requests for Concurrent Process Engineering.
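The Autosys scheduling mentioned above can be illustrated with a JIL job definition; every job name, machine, owner, and path here is a hypothetical placeholder, not an actual project artifact:

```
/* Hypothetical JIL sketch; all names and paths are assumptions */
insert_job: catman_daily_load   job_type: CMD
command: /apps/etl/scripts/run_catman_daily.sh
machine: etlprod01
owner: etladmin
condition: success(catman_file_watcher)
std_out_file: /apps/etl/logs/catman_daily_load.out
std_err_file: /apps/etl/logs/catman_daily_load.err
```

Submitting the definition with `jil < catman_daily_load.jil` registers the job; the `condition` line makes it wait for an upstream file-watcher job to succeed.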

Environment: Informatica PowerCenter 9.0.1, Oracle, SQL Server 2005 and 2008, flat files, Autosys, Windows NT, and UNIX.

Confidential

Team Lead.

Description:

Migrating the Data Warehouse from Teradata to Netezza

Successfully completed a migration project for 21st Century Fox, migrating the CATMAN data warehouse from Teradata to Netezza and implementing optimal usage of Informatica's pushdown feature.

Responsibilities:

  • Worked with Informatica Designer on the functional description, scope, and detailed functional requirements.
  • Extensively worked on data extraction, transformation, and loading from flat files and large-volume data sources.
  • Designed and created data cleansing and validation scripts using the Informatica ETL tool.
  • Responsible for creating and importing all required sources and targets to the shared folder.
  • Worked with different sources such as relational databases (Oracle, SQL Server), flat files, and Salesforce.
  • Created mappings using Informatica Designer to build business rules to load data.
  • Created folders and placed all tested ETL, Oracle, and UNIX scripts in the staging path for production movement.
  • Created UNIX scripts to handle the FTP of source files, execute them in sequence by timestamp, and archive processed files for future reference.
  • Implemented pushdown optimization for all mappings, thereby reducing processing time.
  • Scheduled the jobs in the development and testing environments using the Autosys scheduler.
  • Prepared the data flow of all Informatica objects created to support integration testing.
  • Prepared unit test plans.
  • Performed troubleshooting for non-uniformities.
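The pushdown usage above is typically driven through a session parameter file; a minimal sketch follows, assuming hypothetical folder, workflow, session, and connection names, and assuming a $$PushdownConfig mapping parameter wired to the session's pushdown optimization attribute:

```
[CATMAN_DW.WF:wf_catman_load.ST:s_m_load_sales]
$$PushdownConfig=Full
$DBConnection_Target=NZ_CATMAN
$$LoadDate=2014-06-30
```

With full pushdown, the Integration Service translates the mapping logic into SQL executed inside Netezza itself, which is what reduces the processing time noted above.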

Environment: Informatica PowerCenter 9.5.1, Oracle 10g/11g, SQL Server, Erwin, MySQL, Toad 9.1, flat files, Windows NT, UNIX, and Salesforce.

Confidential

ETL Developer.

Description: Walmart.com is a lot like your neighborhood Wal-Mart store. Wal-Mart features a great selection of high-quality merchandise, friendly service and, of course, Every Day Low Prices. Founded in January 2000, Walmart.com is a subsidiary of Wal-Mart Stores, Inc. The headquarters is on the San Francisco Peninsula near Silicon Valley, where they have access to the world's deepest pool of Internet executive and technical talent.

Responsibilities:

  • Extracted data from different source systems such as Oracle, flat files, and DB2.
  • Extensively worked with Informatica Designer, Workflow Manager, and Workflow Monitor, including the Source Analyzer, Warehouse Designer, Mapping Designer, and Mapplet Designer.
  • Identified and tracked the slowly changing dimension tables.
  • Wrote batch scripts to embed Informatica sessions and scheduled the jobs.
  • Worked within a team to contribute to the successful delivery of a data warehouse from the ground up.
  • Worked on Informatica tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Used the Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map sources to targets.
  • Worked with different sources such as relational databases (Oracle, SQL Server) and flat files, extracting data using the Source Qualifier and Joiner transformations.
  • Created mappings with various transformations such as Lookup, Joiner, Aggregator, Expression, Filter, Router, and Update Strategy.
  • Worked on the Informatica Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Tested individual mappings in Workflow Manager.
  • Prepared unit test plans.
  • Scheduled the workflows as per user requirements.
  • Supported user queries against availability of data from the data warehouse.
  • Performed troubleshooting for non-uniformities.
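A minimal sketch of the batch-script pattern described above, running dependent workflows in sequence and stopping at the first failure. Workflow, service, and folder names are hypothetical, and pmcmd is stubbed with echo so the sketch runs without a PowerCenter install:

```shell
# PMCMD is stubbed to echo for this sketch; a real deployment would point
# it at the actual pmcmd binary instead.
PMCMD="${PMCMD:-echo}"
STATUS=0
DONE=0

# Run the workflows in dependency order, stopping at the first failure
for WF in wf_stage_load wf_dim_load wf_fact_load; do
    if $PMCMD startworkflow -sv IS_DEV -d Domain_Dev -f WALMART_DW -wait "$WF"; then
        echo "workflow $WF completed"
        DONE=$((DONE + 1))
    else
        echo "workflow $WF failed" >&2
        STATUS=1
        break
    fi
done
```

A scheduler (cron, Autosys, or similar) would invoke this wrapper and use its final status to decide whether downstream jobs may run.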

Environment: UNIX, Netezza, Oracle 8i, Informatica PowerCenter 9.x.
