
Report Studio and Framework Manager Resume

SUMMARY

  • 5.4 years of IT experience in analysis, design, development, implementation, testing, and support of data warehousing solutions using Informatica PowerCenter and Cognos 8.4 (Report Studio and Framework Manager).
  • High exposure to optimizing and automating applications to improve business functionality.
  • Expertise in data warehouse systems, the Hadoop framework, and Big Data concepts.
  • Works well under pressure; a hard worker.
  • Extensively worked on developing ETL programs supporting data extraction, transformation, and loading using Informatica PowerCenter.
  • Experience with dimensional modeling using star and snowflake schemas.
  • Created UNIX shell scripts to run Informatica workflows and control the ETL flow (a minimal launcher sketch appears after this list).
  • Extensively worked on Informatica performance tuning, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings, and sessions.
  • Proficient in developing SQL against relational databases such as DB2, Oracle, and SQL Server.
  • Involved in unit testing and system testing to verify that the data loaded into targets is accurate.
  • Main areas of expertise are analyzing, developing, and testing data warehousing projects.
  • Understand business rules thoroughly from high-level design specifications and implement the corresponding data transformation methodologies.
  • Ability to meet deadlines and handle multiple tasks; flexible in work schedules; good communication skills.
  • Currently working for Confidential as a Hadoop developer for DC Nexus - 401k clients.
  • Good experience with Hadoop technologies including MapReduce, Impala, Pig, Hive, HBase, Oozie, and Sqoop on a Cloudera cluster.
  • Involved in deploying code into production and provided support for the same.
  • Prepared low-level designs (LLDs) and unit test cases (UTCs) for the ETL processes carried out.
  • Also knowledgeable in Cognos Report Studio and Framework Manager.
  • Good knowledge of HDFS architecture.
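
A minimal sketch of the kind of workflow-launcher shell script referenced above, using Informatica's pmcmd command-line utility. The service, domain, folder, and workflow names are hypothetical placeholders, and the password is assumed to come from the calling environment.

    #!/bin/sh
    # Launch an Informatica workflow via pmcmd and fail loudly on error.
    # IS_DEV, Domain_DEV, FOLDER_SALES, and wf_load_sales are hypothetical names.
    INFA_USER="etl_user"          # hypothetical account
    INFA_PWD="$ETL_PWD"           # assumed to be exported by the environment
    pmcmd startworkflow -sv IS_DEV -d Domain_DEV \
        -u "$INFA_USER" -p "$INFA_PWD" \
        -f FOLDER_SALES -wait wf_load_sales
    if [ $? -ne 0 ]; then
        echo "Workflow wf_load_sales failed" >&2
        exit 1
    fi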

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 8.6/9.1/10, Informatica MDM, and Informatica Big Data Edition

Big Data: Hadoop, MapReduce, Pig, Hive, Impala, HBase, Sqoop, Oozie

Reporting: Cognos 8.4 (Report Studio and Framework Manager)

Databases: Oracle 10g, DB2, SQL Server 2008, Teradata

SQL Editors: Advanced Query Tool, Quest Central

OS: Windows XP/2007/NT

SDLC: Agile (JIRA)

Scheduler: Control-M

Scripting: UNIX

Others: Advanced Microsoft Excel functions, SSH clients (SecureFX, SecureCRT, PuTTY)

PROFESSIONAL EXPERIENCE

Confidential

Report Studio and Framework Manager

Responsibilities:

  • Started Hadoop development work based on the assigned user stories.
  • Ingested data into Hadoop using scripts from separately hosted source applications such as BYSL, EA, and LIPPER (a minimal ingestion sketch appears after this list).
  • Analyzed data and proposed data models for various investments and clients.
  • Created fact and dimension tables based on the design, using an SCD Type 2 model.
  • Developed code to derive various calculations using analytical functions and optimized joins (see the Hive sketch after this list).
  • Generated scripts to export the final tables to various layers and finally to the reporting environment.
  • Scheduled shell jobs to import data into HDFS and push the processed data back to HDFS.
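
A minimal ingestion sketch, assuming hypothetical file paths, a placeholder JDBC connection, and a placeholder table name: flat-file feeds are pushed to HDFS directly, while relational sources are pulled with Sqoop. Scheduling would be handled by the site's scheduler.

    #!/bin/sh
    # Push flat-file feeds (e.g., LIPPER extracts) into HDFS.
    # All paths below are hypothetical.
    hdfs dfs -mkdir -p /data/raw/lipper
    hdfs dfs -put /landing/lipper/funds_*.csv /data/raw/lipper/

    # Pull a relational source table into HDFS with Sqoop.
    # Connection string, credentials file, and table name are placeholders.
    sqoop import \
        --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
        --username etl_user --password-file /user/etl/.pwd \
        --table CLIENT_POSITIONS \
        --target-dir /data/raw/positions \
        --num-mappers 4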
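
The analytical-function work could look like the following Hive query, run here from the shell; the table and column names are hypothetical illustrations, not the project's actual schema.

    # Compute a running total and a per-client rank with window functions.
    hive -e "
    SELECT client_id,
           invest_date,
           market_value,
           SUM(market_value) OVER (PARTITION BY client_id
                                   ORDER BY invest_date)       AS running_value,
           RANK()            OVER (PARTITION BY client_id
                                   ORDER BY market_value DESC) AS value_rank
    FROM investments_fact;
    "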

Confidential

Report Studio and Framework Manager

Responsibilities:

  • Interacted with the business users to identify the process metrics and various key dimensions and measures. Involved in the complete life cycle of the project.
  • Worked across the project life cycle, from analysis to production implementation, with emphasis on identifying sources, validating source data, developing the required logic and transformations, and creating mappings to load the data into different targets.
  • Designed and developed Informatica mappings enabling the extraction, transformation, and loading of data into target tables.
  • Created mapplets and reusable transformations and used them in different mappings.
  • Made use of post-session success and post-session failure commands in the Session task to execute scripts needed for clean-up and update purposes (a hedged clean-up sketch appears after this list).
  • Involved in unit testing to verify that the data loaded into targets, extracted from different source systems, is accurate per the user requirements.
  • Responsible for monitoring production status, ensuring the ETL process works as expected, and handling user communication around production issues.
  • Prepared ETL mapping documents for every mapping and a data migration document for smooth transfer of the project from the development environment to testing and then to production.
  • Applied specific business rules to the data as required by reporting and other downstream systems.
  • Loaded data from heterogeneous sources such as DB2, Oracle, and flat files.
  • Designed the workflow strategy and scheduled the workflows through Control-M.
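
A hedged sketch of the kind of clean-up script a post-session success command might invoke; every path and file name below is a hypothetical placeholder.

    #!/bin/sh
    # Clean-up script run as a post-session success command.
    RUN_DATE=$(date +%Y%m%d)
    # Archive the processed source file so a rerun does not pick it up again.
    mv /data/inbound/sales.dat /data/archive/sales_${RUN_DATE}.dat
    # Remove temporary work files created during the session.
    rm -f /data/work/sales_*.tmp
    # Record completion for downstream monitoring.
    echo "${RUN_DATE} session succeeded" >> /data/logs/etl_audit.log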

Confidential

Report Studio and Framework Manager

Responsibilities:

  • Created target tables in the database as per the source file schemas.
  • Extracted data from the different source systems into their respective target systems.
  • Applied business logic to the data staged in the first phase, then loaded it into predefined target systems designed for the respective clients.
  • Validated the transformed data against the business logic using SQL queries run on the source systems (a minimal validation sketch appears after this list).
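
A minimal sketch of source-versus-target validation from the shell, shown here against DB2; the database names, schemas, tables, and amount column are hypothetical placeholders.

    #!/bin/sh
    # Compare row counts and amount totals between source and target.
    db2 connect to SRCDB user etl_user using "$ETL_PWD"
    db2 -x "SELECT COUNT(*), SUM(amount) FROM src.orders WHERE load_date = CURRENT DATE" > /tmp/src_totals.txt
    db2 connect reset

    db2 connect to TGTDB user etl_user using "$ETL_PWD"
    db2 -x "SELECT COUNT(*), SUM(amount) FROM tgt.orders_fact WHERE load_date = CURRENT DATE" > /tmp/tgt_totals.txt
    db2 connect reset

    # Flag any mismatch for investigation.
    if ! diff -q /tmp/src_totals.txt /tmp/tgt_totals.txt >/dev/null; then
        echo "Source/target totals differ" >&2
        exit 1
    fi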

Confidential

Report Studio and Framework Manager

Responsibilities:

  • Understood the business requirements and developed an ETL process to load data from source to target.
  • Extensively used Informatica PowerCenter to load data from flat-file sources into relational target tables.
  • Prepared low-level design documents that were used to develop Informatica mappings.
  • Created Informatica mappings, mapplets, worklets, and workflows.
  • Reviewed mappings, workflows, and output data, verifying that the data loaded into targets follows the business logic.
  • Implemented a slowly changing dimension strategy to manage the data in the dimension tables (a minimal SCD Type 2 sketch appears after this list).
  • Tuned Informatica mappings and SQL queries for better performance.
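
The slowly changing dimension handling was built in Informatica mappings; for illustration only, the equivalent SCD Type 2 logic is sketched below as SQL run from the shell against Oracle. The table, column, and sequence names are hypothetical.

    #!/bin/sh
    # SCD Type 2: close out changed current rows, then insert new versions.
    sqlplus -s etl_user/"$ETL_PWD"@ORCL <<'EOF'
    -- Expire the current version of rows whose tracked attribute changed.
    UPDATE customer_dim d
       SET d.eff_end_date = SYSDATE, d.current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1 FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND s.address <> d.address);
    -- Insert a new current version for changed and brand-new customers.
    INSERT INTO customer_dim
        (customer_key, customer_id, address, eff_start_date, eff_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1 FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y'
                          AND d.address = s.address);
    COMMIT;
    EOF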
