
Informatica MDM Architect / Informatica Developer Tool / Data Analyst / IDQ / IDE / Data Quality Lead / Data Modeler / Solution Architect Resume


Carrollton, TX

PROFESSIONAL SUMMARY:

  • Over 15 years of experience in the software development life cycle, ETL and data quality, trading systems, business requirements analysis, application design, development, data modeling, and testing of data warehousing, database systems, SAP, and Informatica Cloud services for various projects/clients, from conceptualization to implementation. Shortened ramp-up time on projects and initiatives by providing proven solutions, a methodology, and best practices.
  • Over 12 years of experience in the financial industry across Investment Management (Equity and Fixed Income), Asset & Wealth Management, Retail Financial Services, Consumer & Business Banking, and Data Governance lines of business.
  • Experience with data architecture, data mining, large-scale data modeling, data integration, and business requirements gathering/analysis for data quality.
  • Strong expertise in installing and configuring the complete Informatica MDM suite (Informatica MDM Hub Server, Hub Cleanse, Platform engineering, Resource Kit and Cleanse Adaptors like Address Doctor).
  • Proficient in the development of solutions using Informatica MDM Hub
  • Created a single, verifiable, consolidated source of data that can be referenced across the organization, establishing a single point of entry for all data within the organization into that single source.
  • Analyzed regulatory restrictions and the need to identify the golden record for master data.
  • Designed, configured and delivered MDM hubs across multiple data domains (Customer, Product, Store, etc).
  • Configured and maintained various components of the MDM Hub, including the schema, staging and landing tables, base objects, lookups, hierarchies, display queries, put queries, and query groups.
  • Defined trust and validation rules and set up match/merge rule sets to produce the right master records.
  • Expertise in Informatica MDM Hub Match and Merge Rules, Batch Jobs and Batch Groups.
  • Responsible for creating user groups and assigning privileges and roles to users using Security Access Manager (SAM).
  • Configured Informatica Data Director (IDD) for data governance use by business users, IT managers, and data stewards; implemented IDD hierarchy configuration and created subject area groups, subject areas, subject area children, IDD display packages in the hub, and search queries for finding data.
  • Strong working experience in Informatica PowerCenter 9.6/9.1/8.6.x, IDQ/IDE, Informatica Metadata Manager 9.6, Informatica Developer, Informatica Analyst, PowerExchange CDC, Data Transformation Developer, MDM Hierarchy Manager, web services, and SAP BW services to extract, transform, and load data.
  • Strong expertise in designing and developing Business Intelligence solutions in staging, populating Operational Data Store (ODS), Enterprise Data Warehouse (EDW), Data Marts / Decision Support Systems using Informatica Power Center 9.x/8.x/7.x/6.x ETL tool.
  • Expertise in developing UNIX shell scripts and automating processes with the Autosys, Control-M, Maestro, and cron job schedulers.
  • Experience in Power Exchange to connect and import sources from external systems like SAP R/3, DB2, Salesforce, Mainframes, and AS/400.
  • Hands-on experience using Informatica to integrate with SAP HANA.
  • Designed and developed Informatica interfaces for SAP systems.
  • Good experience in Trading Application System Support such as Latent Zero Capstone Suite (Minerva, Tesseract and Sentinel) and Muni Perform.
  • Experienced in installing, configuring, and administering most Informatica products, such as Data Quality, Identity Resolution, JasperSoft, PowerCenter, PowerExchange, Data Explorer, and Metadata Manager, in client/server environments, including applying hotfixes and upgrading systems from 8.6 to 9.0, 9.1, and then 9.6, along with the initial POCs.
  • Extensive experience in Informatica 7.1/8.0/8.6/9.1/9.5/9.5.1/9.6 products and Talend, and in the Teradata, Sybase, Oracle, DB2, SQL Server, and MySQL databases.
  • Strong knowledge of methodologies and concepts including star schema, snowflake schema, dimensional data modeling, fact/dimension tables, slowly changing dimensions, change data capture (CDC), and pushdown optimization (PDO).
  • Experienced in developing External Tables, Stored Procedures, Functions, Packages, Views, Cursors and Database Triggers.
  • Excellent skills in data modeling using Erwin and Visio, along with re-engineering.
  • Expertise in data profiling and data quality analysis/solutions using Informatica tools.
  • Excellent skills in application performance tuning, expertise in documenting processes, and extensive work with Informatica Professional Services to resolve issues.
  • Configured Proactive Monitoring, IDQ, IIR, MDM Hub, and IDD; defined rules, built scorecards, applied metrics, and grouped them into various dimensions in IDQ.
  • Worked as an architect on IDQ (Informatica Data Quality) solutions covering data quality business rule creation and assessment/reporting requirements; implemented data cleansing, tuning, and profiling, and built reports and dashboards to display DQ results using IDQ.
  • Extensively worked on developing Mapping Design Documents, Code Review Documents, Business Requirement Documents, Change/Process Documentation, Run Books, Test plans, Project plans, Training, Navigation manuals etc.
  • Strong knowledge in using SQL*Loader, Materialized views, Export and Import Utilities.
  • Strong knowledge of database administration tasks such as designing data models, implementing relationships, handling security issues, importing and exporting databases, and managing tablespaces.
  • Extensive experience in data migration projects, including validating data with scripts/tools that perform data cleansing, profiling, metadata management, and partitioning.
  • Provided extensive 24x7 production support to various applications, along with server maintenance projects.

Technical Skills:

RDBMS: Oracle 11g/10g/9i/8i, SQL Server 2008/2005/2000, MS Access 9.0/7.0, Teradata 12/13, Sybase, UDB DB2, MySQL

ETL Tools: Informatica 10.1/10/9.6/9.5.1/9.5/9.1/9.0.1/8.6/8.1/8.0/7.1 products (PowerCenter, PowerExchange, Data Explorer, Data Analyst, MDM, Data Transformation Studio/Informatica Developer, Workbench, Metadata Manager, and Informatica Lifecycle Management), Proactive Monitoring, Ab Initio (GDE, Co>Operating System, and Metadata Portal/Hub), Talend

BI/OLAP Tools: Cognos 9.0/8.0, Business Objects XI, Hyperion Essbase, Tableau, Oracle Reports

Operating Systems: UNIX (AIX, HP-UX), Windows NT/2000/2003/2008/XP/Vista/7

Data Modeling: Logical/Physical Data Modeling, IBM Rational Rose, Erwin 7.3.10, Microsoft Visio 2004/2007/2010

Programming Languages: SQL, PL/SQL, C, C++, Java, C-shell, K-shell, Perl

Process/Methodologies: ETL Methodologies/Processes- Data Profiling, Data Quality, Data Services, Change-Data-Capture (CDC), Push Down Optimization (PDO)

Version Control Tools: Informatica (repository versioning), TortoiseSVN, Visual SourceSafe, and WinCVS

Schedulers: Control M, Autosys, Maestro and Cron

Other Tools: Toad, Oracle SQL Developer, Embarcadero Rapid SQL, PuTTY, WinSCP, Telnet, SFTP, FTP, NDM process, MS Project Planner, Citrix environments

WORK EXPERIENCES:

Confidential, Carrollton, TX

Informatica MDM Architect / Informatica Developer Tool / Data Analyst / IDQ / IDE / Data Quality Lead / Data Modeler / Solution Architect

Responsibilities:

  • Installed and configured the Informatica MDM 10.1 Hub on the JBoss application server.
  • Worked as an Informatica MDM Architect to install and configure the Informatica and MDM suites.
  • Established POCs of new ideas.
  • Evaluated proofs of technology (POT) for new architecture and tools.
  • Interacted with business users and enterprise data architects on the analysis and design of the conceptual and logical data models required for a solid MDM implementation, ensuring they met business requirements and supported the product's capabilities.
  • Involved in all phases of the project SDLC: planning, design, documentation, configuration, testing, and code migration.
  • Configured base object tables in the schema and relationship tables in hierarchies according to the data model.
  • Created landing, staging, and base tables, and defined foreign key relationships and lookups in the Hub.
  • Developed mappings to move loaded data from Landing to Stage using various cleanse and Address Doctor functions.
  • Developed the match paths, match columns, and match rules (exact and fuzzy) for the match and merge process.
  • Worked with Hierarchy Manager to develop hierarchies per business needs.
  • Created IDD applications per business needs by creating subject areas and subject area groups.
  • Developed put and display packages for both Hierarchy Manager and the IDD application.
  • Defined the match and merge rules in the MDM 10 Hub by creating path components, columns, and rules.
  • Configured the automation process for the Land, Stage, and Load processes (batch process).
  • Used Metadata Manager to import and export metadata and promote incremental changes between environments, from development to testing.
  • Worked closely with the Business in gathering the Match rules and created the Match rule document
  • Used Informatica data quality tool to cleanse and standardize the data while loading to staging tables.
  • Ran the Match and Merge set up and fine-tuned the Match rules
  • Worked with Security Access Manager (SAM) to create roles and assign users.
  • Executed MDM stored procedures and batch groups for the MDM Hub (see the sketch after this list).
  • Additionally, configured IDD, reporting, and dashboard services for future client needs.
  • Integrated the MDM Hub with external applications using SIF APIs.
  • Carried primary responsibility for problem determination and resolution for each SAP application system's database server and application server.
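
A minimal sketch of how such a batch group run can be scripted, assuming the CMXBG.EXECUTE_BATCHGROUP procedure described in the MDM Hub batch group documentation; the batch group ROWID, connection details, and exact parameter names are placeholders to verify against your hub version:

    -- run_batchgroup.sql (hypothetical helper script)
    SET SERVEROUTPUT ON
    DECLARE
      out_err_msg  VARCHAR2(4000);
      out_err_code NUMBER;
    BEGIN
      -- Procedure and parameter names per the MDM batch group docs.
      cmxbg.execute_batchgroup(
        in_rowid_batchgroup => 'SVR1.ABC123',   -- placeholder ROWID
        in_resume           => 'FALSE',
        out_error_message   => out_err_msg,
        out_error_code      => out_err_code);
      DBMS_OUTPUT.PUT_LINE('code=' || out_err_code || ' msg=' || out_err_msg);
    END;
    /

    # Shell wrapper (ORS credentials are placeholders):
    sqlplus -s cmx_ors/ors_pwd@MDMDB @run_batchgroup.sql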

Environment: Informatica Platform 10.1/10.0/9.6/9.5.1, Oracle 12c, SQL Server Management Studio, SQL*Plus, Windows XP and AIX UNIX, Agile methodology, HP Quality Center, MS SQL Server, SAP 3.1.H, flat files, PL/SQL, and Unix scripting.

Confidential

Informatica Cloud implementation

Responsibilities:

  • Built the platform to implement Informatica Cloud to extend the advantages of enterprise cloud computing.
  • Veeva CRM was built on Salesforce.com to extend the value of their CRM investment through the platform.
  • Developed apps that handle data replication, data synchronization, and data mapping with data transformation logic, and scheduled them to run on a recurring basis (see the sketch after this list).
  • Worked on cloud-to-cloud and on-premises-to-cloud integration.
  • Applied data quality techniques in building the apps, including data assessment, scorecards, and data masking.
  • Used the available transformations to build complex mappings that load data into Salesforce analytics.
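
A minimal sketch of triggering such a scheduled task on demand, assuming the classic Informatica Cloud v2 REST API (log in, then start a job); the host, credentials, and task ID are placeholders, and the endpoints should be checked against your org's API documentation:

    #!/bin/ksh
    # Log in; the response carries icSessionId and the org-specific serverUrl.
    RESP=$(curl -s -X POST "https://app.informaticaondemand.com/ma/api/v2/user/login" \
      -H "Content-Type: application/json" \
      -d '{"@type":"login","username":"user@example.com","password":"secret"}')

    SESSION=$(echo "$RESP" | sed -n 's/.*"icSessionId":"\([^"]*\)".*/\1/p')
    SERVER=$(echo "$RESP" | sed -n 's/.*"serverUrl":"\([^"]*\)".*/\1/p')

    # Start the task; taskType DSS = data synchronization, DRS = replication.
    curl -s -X POST "$SERVER/api/v2/job" \
      -H "Content-Type: application/json" \
      -H "icSessionId: $SESSION" \
      -d '{"@type":"job","taskId":"0000010000000000001","taskType":"DSS"}'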

Environment: Informatica 9.6, MDM 10/10.1, Informatica Cloud, Oracle 11g, Agile methodology, MS SQL Server, flat files, SAP, Informatica Cloud Services.

Confidential  

Sr Consultant

Responsibilities:

  • Built the platform to implement Proactive Monitoring for PowerCenter and Data Quality.
  • Installed Proactive Monitoring 3.0 with RulePoint and RTAM 6.1.
  • Customized out-of-the-box rules for both PowerCenter Operations and Governance.
  • Built new rules per the client's requirements.
  • Configured, profiled, and applied the out-of-the-box data quality rules provided by the product, and helped the client understand the process along with the reference data.
  • Installed and configured content-based data dictionaries for the data cleansing, parsing, and standardization processes to improve the completeness, conformity, and consistency issues identified in the profiling phase.
  • Configured Address Doctor content on both the PowerCenter and IDQ servers and helped users build scenarios.
  • Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter; created POC data quality mappings in the IDQ tool and imported them into PowerCenter as mappings and mapplets.
  • Configured the Analyst tool (IDE) and helped data stewards and business owners profile source data, create scorecards, apply built-in DQ rules, and validate the results.
  • Configured Data Director to help data stewards and business owners with further profiling analysis.

Environment: Informatica Platform 9.5.1, Oracle 11g, SQL Server Management Studio, SQL*Plus, Windows XP and AIX UNIX, Agile methodology, MS SQL Server, flat files, PL/SQL and Unix scripting, Cloud.

Confidential, Dallas, TX

Sr. Consultant

Responsibilities:
  • Administration: Installed, configured, and administered the above products, versions 9.6/9.5.1, on the UNIX platform, then applied HotFix 3 (see the health-check sketch after this list).
  • Metadata Manager: Configured the metadata repository to create lineage across Erwin data models; Teradata, Oracle, and SQL Server databases; PowerCenter ETL objects; and Business Objects reports.
  • Business Glossary: Defined and created business terms and categories, rules, usage, and data stewards and owners for business terms, then associated them with data objects and produced the lineage.
  • Gained hands-on experience using Informatica to integrate with SAP HANA.
  • Security: Configured users, groups, roles, and privileges and assigned them to each user group; handled data security, automation of schedules, and LDAP configuration to sync with Microsoft Active Directory.
  • Created the Metadata Manager, Reporting, and Reference Table Manager services in the ISP Admin Console to use Metadata Manager effectively.
  • Worked on cloud-to-cloud and on-premises-to-cloud integration.
  • Customized the standard Informatica metamodel to accommodate the full range of metadata defined for the collection and storage of enterprise metadata.
  • Data Quality: Configured, profiled, and applied the out-of-the-box data quality rules provided by the product, and helped the client understand the process along with the reference data.
  • Data Dictionaries: Installed and configured content-based data dictionaries for the data cleansing, parsing, and standardization processes to improve the completeness, conformity, and consistency issues identified in the profiling phase.
  • Address Doctor: Configured Address Doctor content on both the PowerCenter and IDQ servers and helped users build scenarios.
  • Integrated Informatica Data Quality (IDQ) with Informatica PowerCenter; created various data quality mappings in the IDQ tool and imported them into PowerCenter as mappings and mapplets.
  • Analyst Tool: Configured the Analyst tool (IDE) and helped data stewards and business owners profile source data, create scorecards, apply built-in DQ rules, and validate the results.
  • TDM: Configured persistent data masking and data subset to mask sensitive data.
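
A minimal sketch of the kind of nightly health check and repository backup used in this administration work, assuming the standard infacmd and pmrep command-line tools; the domain, node, repository, path, and credential values are placeholders, and flag spellings should be checked against your version:

    #!/bin/ksh
    export PATH=/opt/informatica/9.6.1/server/bin:$PATH

    # Confirm the domain node responds before attempting a backup.
    infacmd.sh isp ping -dn INFA_DOM -nn node01 || exit 1

    # Connect to the PowerCenter repository service and write a backup file.
    pmrep connect -r PC_REPO -d INFA_DOM -n Administrator -x "$ADMIN_PWD"
    pmrep backup -o /backup/PC_REPO_$(date +%Y%m%d).rep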

Confidential, VA

Data Quality Basel II, Data Quality SME

Responsibilities:

  • Delivered quality business solutions, including analysis solutions, within agreed timescales, with key progress checkpoints established, monitored, and controlled, typically by a project manager.
  • Provided high-quality, detailed analysis, design, and build processes to support specific projects with regard to data quality profiling and DQ rule coding.
  • Assisted the team in setting up the Data Governance team, the software development life cycle, project infrastructure design, and best practices.
  • Worked with technical/business analysts for DQ requirements, business analysis & project coordination.
  • Assisted the business analyst/project manager/data steward in defining or modifying project data quality rules based on business rules/processes.
  • Worked as a data quality architect and administrator to establish a data quality methodology, documenting a repeatable set of processes for determining, investigating, and resolving data quality issues, and established an ongoing process for maintaining quality data.
  • Profiled source data using the IDQ tool to understand source system data representations, formats, and data gaps. Created an exception handling process and worked on best practices and standards for exception handling routines.
  • Used Informatica Data Director (IDD) for viewing the error tables and for data manipulations (accepting or rejecting records based on the requirements).
  • Built scorecards, applied metrics, and grouped them into dimensions.
  • Built reusable mapplets and coordinated development in the data quality tool.
  • Performed unit, integration, regression and load testing at the database and application level with the tool.
  • Contributed and maintained a library of reusable test plans for future projects/enhancements.
  • Worked with complex IDQ transformations, including the Address Validator, Key Generator, Match, Parser, Merge, and Exception transformations.
  • Used IDQ for data cleansing, tuning, and profiling, and implemented reports and dashboards to display DQ results.
  • Leveraged Informatica Data Quality to improve data quality and standardize on a single platform that provides a centralized set of reusable rules and tools for managing data quality across any project.
  • Created scripts for better handling of incoming source files, such as moving files between directories and extracting information like dates from file names for continuously arriving sources (see the sketch after this list).
  • Involved extensively in Unit testing, integration testing, system testing and UAT.
  • Created, backed up, restored, and handled DR for Informatica repositories.
  • Handled space issues on UNIX servers to minimize repository unavailability.
  • Extensively worked on Informatica deployments across different repositories.
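
A minimal sketch of such a file-handling script; the directory paths and the FEED_YYYYMMDD.dat naming pattern are hypothetical:

    #!/bin/ksh
    # Stage incoming source files and extract the business date from the name.
    INBOX=/data/inbound
    STAGE=/data/staging
    ARCHIVE=/data/archive

    for f in "$INBOX"/FEED_*.dat; do
      [ -e "$f" ] || continue                  # nothing to process
      base=$(basename "$f")
      filedate=$(echo "$base" | sed 's/^FEED_\([0-9]\{8\}\)\.dat$/\1/')
      mkdir -p "$STAGE/$filedate"
      mv "$f" "$STAGE/$filedate/"              # hand off to the ETL run
      cp "$STAGE/$filedate/$base" "$ARCHIVE/"  # keep an audit copy
      echo "$(date '+%Y-%m-%d %H:%M:%S') staged $base for business date $filedate"
    done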

Environment: Informatica Data Quality 9.5.1 (IDQ), Oracle 11g, SQL Server Management Studio, SQL*Plus, Windows XP and AIX UNIX, Agile methodology, Teradata, HP Quality Center, MS SQL Server, Informatica Data Director 9.5.1 (IDD), flat files, PL/SQL, and Unix scripting.

Confidential, OH, NJ and NY

Data Management & Analytics, Data Quality Lead

Responsibilities:

  • Assisted the team in setting up the Data Governance team, the software development life cycle, project infrastructure design, and best practices.
  • As part of data quality management, responsible for designing rules, testing, deploying, and documenting the metrics and profiling results, and maintained them per the needs of the different LOBs using Informatica tools such as IDQ, PowerCenter, and Metadata Manager.
  • Played a key role in creating and managing the data quality process and in developing DQ rules, profiles, profile models, and scorecards for various business requirements.
  • Created many column profiles and profile models with heterogeneous source systems.
  • Developed ETL to move the above data quality object results from the profiling warehouse to the IQM data mart. The process involved challenging tasks: customizing data from relational and flat-file-based profiles and their results, as well as data from customized data objects created from SQL statements, and fetching the DQ rule requirements from the Ab Initio metadata.
  • Involved in creating Metadata Manager instances in the DEV, UAT, and PROD environments to pull metadata for various relational databases such as Oracle, Teradata, and DB2.
  • Involved in developing an ETL process to customize the metadata details from the Metadata Manager database and feed them into the Ab Initio database to create data lineage for each data source.
  • Developed a UNIX shell script that automates the Metadata Manager load, column profile, profile model, DQ rule, and ETL jobs; the script also handles load status notifications and an exception report listing the failed DQ rules (see the sketch after this list).
  • Created and managed Control-M scheduler jobs to automate the DQ process and reduce manual intervention.
  • Installed and configured Proactive Monitoring for PowerCenter Governance to streamline data integration processes and deliver on SLAs.
  • Created more than 100 monitoring rules in all 3 environments.
  • The environment was first set up on Windows and then moved to the Unix platform.
  • Worked as an administrator setting up the DQ environments in DEV, UAT, and PROD, upgrading them from 9.1 to 9.5, and managing them: recycling the Data Integration Services, applying hotfixes and EBFs, collecting infa9dumps and infa9logs, and customizing the DIS properties to improve profile run times.
  • Installed and configured Informatica 9.1 Master Data Management (MDM).
  • Configured LDAP security for the MDM multi-domain hub.
  • Configured the ORS against CMX_SYSTEM and worked on setting up tool access and SAM (Security Access Manager) for the different ORSs.
  • Worked on MDM change list migrations from the Dev to the Test and PROD environments.
  • Started a data quality team, both onshore and offshore, to establish a data quality methodology, document a reusable set of processes for determining, investigating, and resolving data quality issues, establish an ongoing process for maintaining quality data, and define data quality audit procedures.
  • Created, backed up, restored, and handled DR for Informatica repositories.
  • Collaborated directly with the different LOB business data owners to establish the quality business rules that provide the foundation of the organization's data quality improvement plan.
  • Assisted the business analysts/project manager/data stewards in defining or modifying the project Data Quality Rules based on business requirements.
  • Worked on end-to-end project implementations and created presentations, framework/navigation/user-help/requirement documents, and run books.
  • Actively involved in designing, developing, and testing the data quality rules, profile models, and column and primary key profiling for various data sources to determine root causes and ensure correction of data quality issues caused by technical or business processes.
  • Actively involved in performance tuning of IDQ profiles.
  • Produced various measures and reports (in summary and in detail) for management on the progress of data quality improvement.
  • Created the Informatica metadata load process in all the environments to feed data to the Ab Initio team for creating data lineage for each data source in the Ab Initio Metadata Portal/Hub.
  • Created Unix shell scripts, automated with the Control-M scheduler, that ran the profile and profile model batch executions for the data quality rules as well as the ETL and metadata load batches.
  • Responsible for the profile and ETL batches in all the environments, and for migrating the shell scripts and supporting project files on the Unix servers.
  • Implemented and governed the best practices relating to enterprise metadata management standards.
  • Conducted walkthrough reviews and profiling reviews, and guided users in understanding and using the Informatica Data Quality and Ab Initio metadata tools while ensuring best practices and standards were followed.
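
A minimal sketch of the automation wrapper described above: run each batch step, log it, notify on failure, and collect failed steps into an exception report. The step commands, paths, and mail recipient are placeholders:

    #!/bin/ksh
    LOG=/var/log/dq/run_$(date +%Y%m%d).log
    REPORT=/var/log/dq/failed_steps_$(date +%Y%m%d).txt
    NOTIFY="dq-team@example.com"

    run_step() {
      name="$1"; shift
      echo "$(date '+%Y-%m-%d %H:%M:%S') starting $name" >> "$LOG"
      if "$@" >> "$LOG" 2>&1; then
        echo "$(date '+%Y-%m-%d %H:%M:%S') $name OK" >> "$LOG"
      else
        echo "$name" >> "$REPORT"           # collect failures for the report
        mailx -s "DQ batch: $name FAILED" "$NOTIFY" < "$LOG"
      fi
    }

    # Hypothetical steps; the real ones invoked Metadata Manager loads,
    # profile runs, and ETL workflows via the site's command-line wrappers.
    run_step metadata_load /opt/dq/bin/load_metadata.sh
    run_step profile_batch /opt/dq/bin/run_profiles.sh
    run_step etl_batch     /opt/dq/bin/run_etl.sh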

Environment: Informatica Data Analyst 9.1/9.5/9.5.1, Informatica Data Quality 9.1/9.5/9.5.1, Informatica PowerCenter 9.1/9.5, Teradata 13, DB2, Sybase, MS SQL Server, Oracle 10g/11g, Cognos 9, Toad, Sun Solaris, Windows 2008 Server, k-shell scripts, Control-M, JIRA, Ab Initio Metadata Portal/Hub 3.0.

Confidential, OH, NJ and NY

Senior ETL Support Specialist

Responsibilities:

  • Worked with Informatica PowerCenter 7.1/8.1/8.6 to process data into the marts.
  • Worked with the team to understand which batch jobs needed to be held, executed the conversion process, and validated the data. All the steps were fully documented and revised; helped the team with a full level of expertise and confidence, and executed data conversions using third-party utilities on vendor tools, such as account conversion for the Latent Zero Capstone suite to update information in Minerva and Muni Perform in the trading apps.
  • Involved in key BC/DR, infrastructure recycle, BSA refresh, and regular database refresh activities from Production to the Test and UAT environments; learned the infrastructure environment and the various dependencies of the refresh activities, executed them multiple times without issue, and later trained new members of the team.
  • Funds Management and Muni Perform month-ends were processed manually and were among the most critical and laborious tasks in IMA. Managed the process successfully, as the highly critical asset market value reports generated from it were required by senior management.
  • Involved in Autosys and Control-M batch job design; created, modified, and developed multiple Control-M job streams as demanded, tested them, and proved them through successful implementations to production, along with ETL (Informatica) and Perl scripting (see the sketch after this list).
  • Identified a few key areas of improvement, discussed them with the AD teams, and worked with them on testing and validation along with the UAT and production implementations.
  • Worked closely on the preparation and execution of all monthly Informatica, database, Java, and Control-M batch releases, without any impact to the production environment.
  • Trained and shared knowledge with the offshore team so that they were fully capable of handling overnights; also trained them in Control-M application development to perform application health checks and more.
  • Voluntarily supported many additional process activities and worked closely with all the teams to understand and fix business requirements or production issues for all the applications.
  • Worked on complete SDLC from Extraction, Transformation and Loading of data using Informatica.
  • Involved in the analysis, design, and development of all the interfaces using Informatica PowerCenter tools on the interface team, and interfaced with all the other tracks on business-related issues.
  • Worked with different source systems such as DST (TA2000), SEI, FETA, NFS, PTS, and CDF, and loaded data into the warehouse for business user reporting.
  • Used Debugger to test the mappings and fix the bugs.
  • Responsible for creating business solutions for Incremental and full loads.
  • Involved in creating mappings to load the data from Source to ODS and finally into Data Warehouse tables. Also involved in creating complex mappings to load data into Dimension and Fact Tables using Informatica.
  • Worked with COBOL data sources by creating the copybooks, flat files, and relational sources.
  • Worked with different transformations such as Aggregator, Expression, Lookup, Update Strategy, Normalizer, and Router.
  • Involved in 3rd-level support for Production.
  • Worked extensively with Cognos ReportNet Framework Manager, Query Studio and Report Studio.
  • Involved in developing and maintaining the packages in the Framework Manager.
  • Developed and modified various complex reports according to the business requirement using Query Studio.
  • Involved in fine-tuning report performance by tuning the tabular SQL used in the reports.
  • Involved in the upgrade to Informatica 8.1 and in the migration from DB2 to Oracle 10g.
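
A minimal sketch of how a Control-M or Autosys job stream kicked off a PowerCenter workflow via pmcmd; the service, domain, folder, workflow, and credential names are placeholders:

    #!/bin/ksh
    pmcmd startworkflow \
      -sv INT_SVC -d INFA_DOM \
      -u batch_user -p "$BATCH_PWD" \
      -f MARTS_FOLDER -wait wf_daily_mart_load

    rc=$?
    if [ $rc -ne 0 ]; then
      echo "wf_daily_mart_load failed with rc=$rc" >&2
      exit $rc    # nonzero exit lets the scheduler mark the job failed
    fi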

Environment: Informatica PowerCenter 7.1/8.1/8.6, DB2, Sybase, MS SQL Server, Oracle 9i/10g, Perl, Cognos Reports, Business Objects XI, Rapid SQL, Toad, Sun Solaris, Autosys, Control-M, WinCVS.

Confidential, Columbus, OH

Programmer/Analyst

Responsibilities:

  • Developed complex graphs having number of components with emphasis on optimizing performance
  • Analyzed requirements and extracted, populated, and loaded data from Oracle and DB2 into the Oracle 8i data warehouse database using Ab Initio as the ETL tool.
  • Designed and used different transform components in the GDE to extract data from flat files, DB2, and Oracle, and loaded it into the target data warehouse.
  • Implemented Partition techniques using MFS with Partition by Key, Partition by Expression and Round Robin techniques on the data before sending the data through data quality checks.
  • Created Ab Initio graphs based on business requirements using various Ab Initio components such as reformat, rollup, join, scan, normalize, generate records, validate records etc.
  • Used Ab Initio as ETL tool to extract data from source systems, cleanse, transform, and load data into target databases.
  • Created graphs to load data, using components like Aggregate, Rollup, Filter, Join, Sort, Partition and Departition.
  • Tested graphs using Validate Components like Compare-Records, Check Order, Validate Records and Generate-Records.
  • Created checkpoints, phases, and worked with multi-file data for parallel processing using Ab Initio (ETL tool) to load data from flat files, SQL Server and Oracle.
  • Coded and implemented shell scripts for migration of database from production to development system.
  • Loaded data into warehouse tables from the staging area using SQL*Loader.
  • Monitored Ab Initio jobs within the deployment environments using Ab Initio commands (see the sketch below).
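
A minimal sketch of running a deployed graph, which Ab Initio deploys as a ksh script; the environment file, run directory, and graph name are hypothetical:

    #!/bin/ksh
    . /opt/abinitio/config/ab_env.ksh        # site environment setup (assumed)
    cd /apps/etl/run || exit 1

    ./load_customer_dim.ksh > logs/load_customer_dim_$(date +%Y%m%d).log 2>&1
    rc=$?

    if [ $rc -ne 0 ]; then
      # A failed phased graph leaves a recovery file; operations either rerun
      # to resume from the last checkpoint or roll back per site policy.
      echo "graph failed rc=$rc; check the recovery file before rerunning" >&2
      exit $rc
    fi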

Environment: Ab Initio (GDE 1.0.8, Co>Operating System 2.0.8), Oracle 8i, SQL, PL/SQL, SQL*Loader, Control-M, shell scripting, and UNIX

Confidential, Charlotte, NC

Programmer/Analyst

Responsibilities:

  • Designed & developed various Stored Procedures, Functions, Packages and Database Triggers.
  • Gathered client requirements, validated and analyzed client data to find trends and relationships.
  • Designed and implemented Autosys jobs to automate all the tasks, executing nightly from 11:00 PM onwards.
  • Provided maintenance and production support for the various Autosys jobs in the automation process.
  • Designed, developed, and enhanced user interfaces and reports using Forms 6i and Reports 6i, as well as dynamic reports in ASP for the AART module.
  • Loaded data from files received in different formats into the Dev, Test, and Prod server systems using SQL*Loader (see the sketch after this list).
  • Performed shell scripting using the vi editor.
  • Created external DB links to access data from third-party tables/remote databases and interfaced them with our own tables.
  • Implemented advanced modeling and optimization techniques to solve analytical problems in the retail optimization area.
  • Involved in writing complex stored procedures (Oracle) to facilitate functionality for invoking through VB and Java applications. Wrote complex queries using SQL.
  • Developed a web-based application that allowed access to the Oracle Business Reports.
  • Maintained all documentation and spreadsheets for Oracle Database operations. Prepared documentation for database operations standards.
  • Handled large volumes of data; most of the tables had over 100 million records.
  • Developed server-side automated e-mail packages, forms, and reports.
  • Troubleshot development problems.
  • Created test plans and test scenarios for the functionality and performance testing.
  • Performed user acceptance testing and end-user training.
  • Made subsequent design alterations based on user requirement changes.
  • Worked along with the DBA on a major part of the database maintenance tasks.
  • Implemented coding and documentation standards, followed according to SDLC standards.
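
A minimal sketch of such a SQL*Loader load for a comma-delimited feed; the table, columns, and file names are hypothetical placeholders:

    -- accounts.ctl (hypothetical control file)
    LOAD DATA
    INFILE 'accounts_20020115.dat'
    APPEND INTO TABLE accounts
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (account_id, account_name, open_date DATE "YYYYMMDD", balance)

    # Invocation from shell; credentials are placeholders. sqlldr returns a
    # nonzero exit code when rows are rejected, so wrappers can alert on it.
    sqlldr userid=scott/tiger control=accounts.ctl log=accounts.log bad=accounts.bad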

Environment: Oracle 9iAS, SQL*Plus, PL/SQL, SQL*Loader, TOAD, Erwin, VB 6.0, ASP, XML, Java, Forms & Reports, Unix shell scripts, Autosys, JIL, HP-UX (Superdome), Windows NT, and Citrix.

Confidential, Columbus, OH

Programmer

Responsibilities:

  • Extensively involved in creating database procedures, functions, and triggers using PL/SQL.
  • Wrote a C/SQL system to send host mail notifications to users who failed to log time in the LAN-based project management system.
  • Designed and coded a C/SQL program to find records lost during a data transfer.
  • Also responsible for coding and modifications to enforce business rules and to improve the performance of the system.
  • Worked on different subsystems, such as the Alarm, Download, Rate, and Billing subsystems.
  • Designed, developed, and enhanced user interfaces and reports using Oracle Discoverer.
  • Developed PL/SQL scripts and UNIX scripts to manage the interface between two different platforms.
  • Involved in writing complex stored procedures to facilitate functionality invoked through a Java application.
  • Involved in integrating online (Java application) and batch (DataStage jobs) processes using MQ Series.
  • Involved in developing tools for the system test team using Pro*C and PL/SQL.
  • Wrote generic error packages to handle exceptions (see the sketch after this list).
  • Configured roles to prevent non-admin users from accessing database objects outside the context of the application.
  • Wrote large batch programs to run the various subsystems.
  • Performed query optimization (performance monitoring and tuning).
  • Created database triggers for system integrity.
  • Developed reports using Oracle Reports.
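
A minimal sketch of such a generic error-logging package, installed from a shell wrapper via SQL*Plus; the error_log table, package name, and column layout are hypothetical:

    -- err_pkg.sql (hypothetical; assumes an error_log table with these columns)
    CREATE OR REPLACE PACKAGE err_pkg AS
      PROCEDURE log_error(p_module IN VARCHAR2);
    END err_pkg;
    /
    CREATE OR REPLACE PACKAGE BODY err_pkg AS
      PROCEDURE log_error(p_module IN VARCHAR2) IS
        PRAGMA AUTONOMOUS_TRANSACTION;  -- the log row survives a caller rollback
      BEGIN
        INSERT INTO error_log (logged_at, err_code, err_msg, src_module)
        VALUES (SYSDATE, SQLCODE, SUBSTR(SQLERRM, 1, 4000), p_module);
        COMMIT;
      END log_error;
    END err_pkg;
    /

    # Installed from shell (credentials are placeholders):
    sqlplus -s scott/tiger @err_pkg.sql

A caller's exception handler would then use it as: WHEN OTHERS THEN err_pkg.log_error('MY_PROC'); RAISE;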

Environment: Oracle, Java, Oracle Forms 6i, PL/SQL, Pro *C, MS Office, Toad, UNIX.

Confidential

Programmer

Responsibilities:

  • Designed, developed, and enhanced user interfaces and reports using Forms 6i and Reports 6i.
  • Designed a database for user and administrator information.
  • Developed procedures and functions for hospital-based information.
  • Collected data from different locations and stored it in the database using Java.
  • Developed packages and procedures to handle front-end forms and reports using Developer/2000.
  • Involved in Partitioning of data and indexing of partitioned tables.
  • Diagnosed and resolved problems that occurred while designing the databases, playing a major role in completing the project.

Environment: Forms 6i, Reports 6i, Oracle 8i, PL/SQL, SQL*Plus, SQL*Loader, HTML, Java, VSS.
