
Data Architect/Governance Consultant Resume


Clearwater, FL

SUMMARY

  • Around 13 years of Information Technology work experience in the Software Development Life Cycle (SDLC), business requirements analysis, application design, development, data modeling, and developing solutions for various projects/clients - from conceptualization, road maps, POCs, teaching, and best practices to implementation.
  • Over 10 years of experience in Business Intelligence and Data Warehousing solutions, with expertise in the full ETL life cycle - Analysis, Design, Profiling, Build, Development, Data Quality, Testing, and Implementation.
  • Over 7 years of experience in the Financial Services industry - its processes, regulatory and business practices - and in developing framework solutions for Enterprise Information Management, Strategy, and Analytics.
  • Experienced in architecting, evaluating, designing, and developing key data warehouses, data lakes, data marts, and data models, and in leading the end-to-end architecture of enterprise data management analytics platforms encompassing data governance, data security, data integration, metadata management, visualization, reporting, etc., while working with cutting-edge technologies.
  • Excellent problem-solving and decision-making capabilities with effective communication, interpersonal, and project management skills; a self-starter with the ability to work independently and adapt to the needs of the team.
  • Comprehensive knowledge of Software Development Life Cycle methodologies like Waterfall and Agile.
  • Proficient in collaborating with Business, Operations, Applications, Data and Analytics groups to define, establish and run a process that ensures adherence to enterprise data standards and architecture principles.
  • Expert in data modeling using Erwin, with a good understanding of Star Schemas, Snowflake Schemas, Fact/Dimension tables, Slowly Changing Dimensions, Change-Data-Capture (CDC), etc. (a minimal SCD sketch follows this list).
  • Extensively worked on ETL methodologies and Informatica tools like Repository Manager, Power Center Designer, Workflow Manager, Workflow Monitor, Developer, Analyst, Metadata Manager & MDM.
  • Expertise in implementing Data Quality solutions, building an Enterprise Data Dictionary, and in metadata management using different tools like Collibra, Informatica IDQ, MDM, and Developer.
  • Extensively worked on and developed various mappings using different transformations like Expression, Connected/Unconnected Lookup, Filter, Joiner, Union, Update Strategy, Aggregator, Stored Procedure, etc.
  • Expertise in handling data extraction, transformation, and loading between different relational databases like Oracle, Teradata, and SQL Server, as well as flat files and XML files, using ETL tools.
  • Hands-on experience using various BI tools like Cognos, Business Objects, SAS, and Tableau.
  • Expertise in the documentation process; extensively worked on developing different Business Templates, Mapping/Design Documents, Code Review Documents, Data Quality/Governance templates, Change/Process Documentation, Run Books, Test Plans, Project Plans, and Training and Navigation manuals.
  • Adept in formulating Test Plans, Test Cases, and Test Scenarios in conjunction with the Testing team, with expertise in formulating test approaches for Functional Testing and User Acceptance Testing (UAT).
  • Competent in operations leadership, the onshore/offshore execution delivery model, project management, Business Intelligence strategy/delivery, and business-led Data Management & Analytics models.
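
For illustration, here is a minimal Slowly Changing Dimension Type 2 load sketched in generic SQL. The dim_customer and stg_customer tables and every column in them are hypothetical placeholders, not objects from the engagements below:

  -- Step 1: expire the current version of any key whose tracked
  -- attributes changed in the staged extract.
  UPDATE dim_customer d
  SET    d.effective_end_dt = CURRENT_DATE - 1,
         d.current_flag     = 'N'
  WHERE  d.current_flag = 'Y'
    AND  EXISTS (SELECT 1
                 FROM   stg_customer s
                 WHERE  s.customer_id = d.customer_id
                   AND  (s.addr_line1   <> d.addr_line1
                     OR  s.risk_segment <> d.risk_segment));

  -- Step 2: insert a new current version for changed and brand-new keys.
  INSERT INTO dim_customer
         (customer_id, addr_line1, risk_segment,
          effective_start_dt, effective_end_dt, current_flag)
  SELECT s.customer_id, s.addr_line1, s.risk_segment,
         CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM   stg_customer s
  LEFT JOIN dim_customer d
         ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
  WHERE  d.customer_id IS NULL;  -- no current row left: changed or new key

Run in this order, step 2 picks up both the keys step 1 just expired and keys never seen before, which is what keeps the full change history queryable.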

TECHNICAL SKILLS

Databases/Environments: Oracle 11g/10g/9i, SQL Server, Netezza, DB2, MS Access, Teradata Client 13, 12, Hadoop

ETL Tools: Informatica Clients 10.x/9.x/8.x (Power Center Designer - Source Analyzer, Target Designer, Transformation Developer, Mapplet/Mapping Designer) (Workflow Manager - Task Developer, Worklet/Workflow Designer) (Workflow Monitor, Metadata Manager), Data Explorer, Data Analyst, Informatica Developer, Address Doctor, Axon

BI Tools: Cognos 9.0/8.0, SAS, Tableau, BO, Talend, Spotfire, Qlik

Data Modeling: Logical/Physical Data Modeling, E-R Modeling using ERWIN, Microsoft Visio

Other Processes: Rational Unified Process, Spiral, Agile, Data Profiling, Data Quality, Data Services, Change-Data-Capture (CDC), Project Management

Other Tools: MS Project, Toad 10.0/9/8/7, Putty, WinSCP, Autosys, Control-M, CVS Version Control, SFTP, FTP process, MS Outlook, MS Office, HP Quality Center, SharePoint, SVN, VPN

Data Management: Data Governance, Metadata Management, Master Data Management, Data Quality Management, Data Architecture, Data Security, Data Strategy, Analysis

PROFESSIONAL EXPERIENCE

Confidential, Clearwater, FL

Data Architect/Governance Consultant

Responsibilities:

  • To lead a current state assessment project: identify the gaps/challenges/issues, provide recommendations for cleanup, and document the source-to-target mappings and data dictionary of the client environment's DWH/BI infrastructure inventory (a catalog-driven dictionary sketch follows this list).
  • Lead tactical and strategic analysis initiatives using an information gathering, analytics, recommendations, and execution methodology.
  • Develop and recommend a future-state conceptual Enterprise Data Platform Strategy.
  • Define and propose a DW/BI future road map and establish Metadata Management and Data Governance framework/practices.
  • Metadata Management - define the metadata life cycle and the process for capturing metadata assets, ingest metadata from various sources, create enterprise unified metadata views, and build the Business Glossary and Data Lineage using Informatica tools - Enterprise Data Catalog (EDC) and Axon.
  • Perform data quality assessment and impact analysis evaluation of the metadata and critical data elements.
  • Establish the Data Governance framework; define processes, policies, procedures, and roles/responsibilities; and assist the client team with implementation of the overall Data Governance program and with user adoption of the Informatica tools, EDC and Axon.
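
As a sketch of how such a data dictionary can be seeded mechanically (this environment included DB2, whose SYSCAT catalog exposes column metadata; the data_dictionary target table is a hypothetical placeholder):

  -- Seed draft dictionary entries from the DB2 catalog; data stewards
  -- then replace the 'TBD' definitions with approved business language.
  INSERT INTO data_dictionary
         (schema_name, table_name, column_name, data_type,
          length, nullable, business_definition)
  SELECT c.tabschema, c.tabname, c.colname, c.typename,
         c.length, c.nulls,               -- 'Y'/'N' in SYSCAT.COLUMNS
         COALESCE(c.remarks, 'TBD')
  FROM   syscat.columns c
  WHERE  c.tabschema NOT LIKE 'SYS%';     -- skip the catalog itself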

Environment: Informatica 10.x - Enterprise Data Catalog (EDC), Informatica Axon 5.x, Cognos, SQL, Data Stage, TM1, IDS, Kore, MS Visio, DB2, MS Office 2016 (Word, Excel, Power Point, Project, Skype, Outlook, SharePoint)

Confidential, Columbus, OH

Data Architect/Governance Consultant

Responsibilities:

  • To establish an enterprise data quality program, addressing all aspects of data quality improvement by employing new data quality tools and methods while leveraging the existing Collibra metadata business glossary, as part of the enterprise data governance effort.
  • To develop and implement best-in-class Data Governance & Data Quality Management practices to support the Alliance “Know More. Sell More” brand promise.
  • To deliver a data quality/governance solution using Informatica IDQ software for data profiling, rule design, implementation, and building scorecards and dashboards based on KPIs, and to improve the existing Collibra tool usage as a metadata management solution in association with IDQ.
  • Work closely with the EDGC council in establishing best governance practices in alignment with business strategy: an improved understanding of data through gathering and documenting metadata, the ability to measure data quality against business metrics (to be established), and the ability to track and report trends in data quality and issue remediation over time via a consistent, repeatable, and expandable process.
  • To lead the data stewards group as a subject matter expert (SME) in building the standards and processes for handling the organization’s most critical data elements (CDEs), in identifying the key performance indicators (KPIs) and establishing the appropriate DQ measures, metrics and metadata management practices.
  • To identify and capture the lineage of critical master data attributes of the KPIs as part of the metadata management proposal, to support the regulatory and compliance requirements.
  • To work with the Enterprise Data Stewards on profiling the critical data sets, identifying data quality business rules for the discovered data quality issues, analyzing the DQ scorecards & dashboards, and producing data remediation plans in a timely manner (a representative profiling query follows this list).
  • To lead the data stewards groups in data analysis and ensure that maintenance of business data terms and definitions, technical metadata, and data lineage happens in the Collibra tool, per best practices.
  • To train and work with IDQ development team on ETL rule development, implementation and deployment.
  • To work in a leadership role to improve the knowledge, skills, and productivity of the teams by providing training, guidance, coaching and feedback.
  • To work with the executive management and project management team in maintenance of project plan, schedules, communication plans, resource plans, responsibility matrix, work breakdown structures; monitor and mitigate risks; identify and assign resources; write and negotiate statements of work etc.
  • To work with the data integration team on the proposed ETL architecture and platform setup strategy. The recommended strategy is to build a future state that ingests data from internal and external sources and to develop a process to clean, standardize, consolidate, and enrich customer data for analytical/predictive purposes.
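
A representative scorecard-style profiling query for one critical data element, sketched in plain SQL; the customer table, the email rule, and the validity metric are illustrative stand-ins for rules that were actually built in Informatica IDQ:

  -- Completeness and validity counts for a single CDE, suitable for
  -- feeding a DQ scorecard or dashboard.
  SELECT 'customer.email'                               AS cde_name,
         COUNT(*)                                       AS total_rows,
         SUM(CASE WHEN email IS NULL THEN 1 ELSE 0 END) AS null_rows,
         SUM(CASE WHEN email IS NOT NULL
                   AND email NOT LIKE '%_@_%._%'
                  THEN 1 ELSE 0 END)                    AS invalid_rows,
         ROUND(100.0 *
               SUM(CASE WHEN email LIKE '%_@_%._%' THEN 1 ELSE 0 END)
               / COUNT(*), 2)                           AS pct_valid
  FROM   customer;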

Environment: Informatica 9.x - IDQ Analyst/Developer client/Power Center/AddressDoctor/Axon, Collibra 4.x, Erwin Data Modeler, Talend, MS Visio, DB2, SQL, IBM Netezza, MS Office (Word, Excel, Power Point, Project, Lync, Skype, Outlook), SharePoint, VPN.

Confidential, Baltimore, MD

Data Architect

Responsibilities:

  • To lead the State project in consolidating all its legacy systems into a large data warehouse initiative with a master data management strategy.
  • To provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs. Perform data analysis to validate data models and to confirm ability to meet business needs.
  • To evaluate leading industry-standard MDM and Data Integration (ETL) products/platforms by working with different product vendors like Informatica, Talend, Oracle, and IBM, applying the Decision-Analysis-Resolution (DAR) methodology, and to generate product recommendations for further consideration.
  • To review the data residing in different source systems & legacy applications & build a common data model for mastered data with entities such as Person and Case.
  • To perform a current state assessment of Data Flow diagrams to understand sources, flow and relationship.
  • To provide inputs for the To-Be architecture for MD THINK, which comprises an MDM, a Data Integration system, a Hadoop Data Lake, and several operational applications and legacy systems.
  • To work on building a master file using the MDM solution, linking all critical data to one file that will be the common point of reference (a simplified matching sketch follows this list).
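
A simplified deterministic match pass of the kind MDM platforms automate (the person_source table and its columns are illustrative; real mastering relied on the evaluated platform's probabilistic match engine):

  -- Group source records that agree on normalized name and birth date;
  -- each group collapses into, or links to, one mastered Person record.
  SELECT UPPER(TRIM(last_name))  AS last_nm,
         UPPER(TRIM(first_name)) AS first_nm,
         birth_date,
         MIN(source_person_id)   AS survivor_id,  -- naive survivorship rule
         COUNT(*)                AS linked_records
  FROM   person_source
  GROUP  BY UPPER(TRIM(last_name)), UPPER(TRIM(first_name)), birth_date
  HAVING COUNT(*) > 1;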

Environment: Informatica tools 9.x, MS Visio, XML Spy, SQL, Talend, MS Office (Word, Excel, Power Point), Outlook.

Confidential, Erie, PA

Data Architect

Responsibilities:

  • Define & update the informational models for multiple commercial, property and casualty insurance products utilizing IBM Insurance Application Architecture (IAA) framework/ACORD standards and align the models to enterprise architecture.
  • Provide expert guidance on gathering & analysis of business data requirements, build the product definitions and model these needs into data models.
  • Responsible for data profiling, DQ analysis, and extraction of data in the process of defining new product lines.
  • Review the business approved attribute names, definitions, profiling results and make sure they adhere to the enterprise governance and data quality standards and best practices.
  • Attend meetings with business SMEs and data analysts and participate in discussions while creating the business terms glossary and metadata as part of product definitions.
  • Develop conceptual, logical, and physical data models of different insurance product lines using Erwin (a pared-down physical sketch follows this list).
  • Define Business Object Model (BOM) mappings in the Object Orientation paradigm and use them as the source.
  • Work with the integration tech teams on interface design models (XML).
  • Review & deliver the mappings & data models through presentations to the enterprise architecture team and to the application development groups, as applicable.
  • Lead Data Management/Data Governance initiatives & responsible for data assets, definitions, data quality framework, and documentation of all the above.
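
A pared-down physical sketch of the kind of product/coverage structures these models covered; the table and column names below are illustrative only, since the real IAA/ACORD-aligned models were maintained in Erwin against enterprise standards:

  CREATE TABLE insurance_product (
      product_id     INTEGER      NOT NULL PRIMARY KEY,
      product_name   VARCHAR(100) NOT NULL,
      line_of_bus_cd CHAR(3)      NOT NULL  -- e.g. commercial property
  );

  CREATE TABLE coverage (
      coverage_id    INTEGER      NOT NULL PRIMARY KEY,
      product_id     INTEGER      NOT NULL REFERENCES insurance_product,
      coverage_cd    CHAR(5)      NOT NULL,
      limit_amt      DECIMAL(15,2),
      deductible_amt DECIMAL(15,2)
  );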

Environment: Erwin Data Modeler 9.1, Visio, XML Spy, IBM Rational Rose, IBM Data Management Studio, DB2, SQL, IBM InfoSphere, IBM Data Stage, Windows 7, MS Office (Word, Excel, Power Point), Outlook, VPN.

Confidential, Columbus, OH

Responsibilities:

  • Provide insight into the financial and operational impact and risk through Data Analysis, Statistical Modeling, and Business Intelligence.
  • Assisted the CWM/CB process owner in analysis of data systems and federal regulatory requirements for BCBS to develop requirements and a plan for implementation of a successful data governance model.
  • Work with CCB DMC, representing CWM/CB DMC’s BCBS Data Certification program.
  • Accountable for CWM/CB’s Basel Committee on Banking Supervision (BCBS) data assets in different data categories, their identification, definition, monitoring, documentation, communication, certification for Regulatory, Compliance and Executive Reporting needs.
  • Provided stewardship by identifying critical data elements, defining their lineage, analyzing the profiling results, & setting the monitoring rules within the DMC framework, policies & Data Modeling standards (an example monitoring rule follows this list).
  • Oversee the daily data governance activities and identify data gaps, coordinating with data stewards to provide long term solutions.
  • Work with business SMEs to provide subject matter expertise on delivery of data governance programs, perform detailed analysis of processes, and develop process flows.
  • Have a good understanding of master data management (MDM) concepts and data entity types; involved in evaluation of MDM/data quality tools and in providing a recommendation.
  • Lead the development of an enterprise data dictionary for CWM/CB master data and build the Meta & Reference Data Exchange (MRDX) repository, including attributes, ownership, definitions, standards and industry reference data.
  • Work with the ETL Informatica Data Quality team on the BCBS program's data profiling submissions, access, monitoring rules, and analytics; monitor Data Quality issues and build solutions to minimize them.
  • Expertise in root cause analysis, data profiling, and data cleansing using different tools.
  • Work/coordinate with the CWM & ICDW tech teams on handling the different atomic elements & reporting attributes that are in scope of the BCBS program, their data migration, their availability in the ICDW data warehousing platform, & other activities.
  • Participate in Data Management Council/Data Category meetings & contribute towards improvements in data definitions, Data Quality, Data Profiling & certification of BCBS & other assigned data assets and their metadata solutions.
  • Manage the CCB Meta Tracker, CWM Internal Tracker & MRDX updates of BCBS data that is in scope of the CWM & CB LOBs.
  • Manage & maintain the different tasks of CWM Data certification in the BCBS integrated project plan, their timelines, dependencies & other activities.
  • Responsible for reviewing and providing BCBS project updates to my CWM manager as part of our weekly pipeline reports & stakeholder meetings.
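
An illustrative monitoring rule of the kind set on a BCBS critical data element: flag any load date where completeness breaches a tolerance. The counterparty_exposure table, the lei_code column, and the 2% threshold are placeholders, not the actual certified assets:

  SELECT load_dt,
         COUNT(*)                                          AS total_rows,
         SUM(CASE WHEN lei_code IS NULL THEN 1 ELSE 0 END) AS missing_lei,
         CASE WHEN SUM(CASE WHEN lei_code IS NULL THEN 1 ELSE 0 END)
                   > 0.02 * COUNT(*)                       -- 2% tolerance
              THEN 'FAIL' ELSE 'PASS' END                  AS rule_status
  FROM   counterparty_exposure
  GROUP  BY load_dt;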

Environment: Informatica Power Center Client 9.5.0, Erwin Data Modeler 7.3.10, Informatica Data Analyst, Abinitio Metadata Portal/Hub 3.0, MRDX metadata tool, Teradata Client 13/SQL Assistant, SAS, Windows 7, MS Project, MS Office (Word, Excel, Power Point), Outlook, VPN.

Confidential, Columbus, OH

Data Modeler/ IT Relationship Manager

Responsibilities:

  • Worked as Data Analyst/Modeler in EDW Customer team, on Integrated Customer Data Warehouse (ICDW) impacts in ER Release projects like ER113 and ER213, until April 2013.
  • Worked as Data Modeler for the Credit Basel EDW-ICDW Migration Wave-6 program and built the Integration and Semantic Data Models, which are now available in the ICDW production space.
  • Worked as part of the EDW Customer Business Analyst team - reviewed the customer requirements, attended stakeholder meetings, identified the best practices, and worked on proposed new changes for the future state and road map ahead.
  • Analyzed, created & documented complete functional and technical design specification frameworks and mapping documents for the business requirements of Enterprise Release projects.
  • Worked with a variety of stakeholders (application owners, architects, developers, and other support teams across the CBB enterprise) and helped the project manager with sequencing the DW activities to meet project timelines.
  • Knowledge of logical and physical data models; worked on the existing ICDW data model architecture to accommodate the new and changing customer business requirements of ER and Basel projects.
  • Created the Basel logical and physical data models for ICDW BDW model.
  • Worked with the DRB group on new abbreviations, standards, PI data classification, reviews, etc.
  • Worked on a data model prototype for Audit framework metrics.
  • Worked with the IQM Classification process team in managing and adding the new content of our projects to the existing Classification process of the ICDW program.
  • Worked with different LOBs on creating Security Profiles for user access of data and on query conversion from DB2 to TD, and facilitated meetings to bring the EDW users onboard to ICDW.
  • Worked closely with cross-functional business and business intelligence teams to document ETL requirements and turn them into ETL jobs and for reporting needs.
  • Served as Subject Matter Expert (SME) on assigned projects, production support areas and knowledge transfer transition across different cross functional teams.

Environment: Erwin Data Modeler 7.3.10, Abinitio GDE, Informatica, DB2, Teradata Client 13/SQL Assistant, UNIX, Putty, MS Visio 2010, Abinitio Metadata Portal/Hub 3.0, HP Quality Center 10, TOAD 10, Windows 7, MS Office (Word, Excel, Power Point), Outlook, VPN, EURC, ITSM systems

Confidential, Columbus, OH

ETL Tech Lead

Responsibilities:

  • Worked as part of the Business Consulting team on the process assessment project of the Data Governance group - reviewed the customer alignment, conducted stakeholder interviews, worked on the value/non-value-add gap analysis, identified the best practices, and proposed new changes for the future state and road map ahead.
  • Worked as Tech Lead to plan, design, analyze, develop, code, test, debug and document programming to satisfy business requirements for Data Migration projects between EDW and ICDW programs using various IT tools.
  • Worked as the BAU (Business as Usual) lead in maintaining the status quo, identifying and fixing the code defects and other concerns at the semantic layer, and facilitating taking the Wave 2 program to the business users and obtaining their sign-off.
  • Worked with a variety of stakeholders (application owners, directors, architects, DBAs, developers, and other support teams across the CBB enterprise) and helped the project manager with sequencing the DW activities to meet project timelines.
  • Experience in performing impact analysis and risk assessment; provided accurate project task estimates and resource planning forecasts, along with capacity planning, within a value-driven Agile approach to project management.
  • Analyzed, designed & documented complete functional and technical design specification frameworks, Conceptual/Logical/Physical Data Models, mapping documents for business requirements, reverse engineering, and ETL processes.
  • Knowledge of various data related functions such as Data Integration, Data Modeling, Metadata Management, ETL, Business Intelligence, Data cleansing, and Data Profiling.
  • Working knowledge on Data Modeling design concepts of Dimensional Modeling, Star/Snowflake schema creation, Fact and Dimensions, Normalization, Compression techniques.
  • Knowledge of logical and physical data models; managed the existing data models to evolve and meet the new and changing customer business requirements.
  • Worked with the IQM team in managing and adding the metadata content of the Wave 2 project into the Abinitio metadata systems of ICDW program.
  • Designed and developed Informatica workflows and mappings for data loads, making use of different transformations, lookups, stored procedures, change data capture (CDC), pushdown optimization (PDO), performance tuning, and other ETL techniques (a set-based CDC sketch follows this list).
  • Analyzed the varying requirements for the different layers of the project and worked across the SOR, Integration, and Semantic layers of the Wave 2 ICDW program.
  • Worked closely with TD database/Informatica/UNIX administrators for database and infrastructure installations, designs, migrations and maintenance across Dev, QA, UAT and PROD environments.
  • Experience in configuration management, code version control, and SVN practices; conducted ETL code reviews and made sure quality solutions were built and best practices/standards were followed.
  • Expertise in Unit Testing, Integration, Functional, System, Performance, Load, Regression Testing and Validation testing techniques and provided support all through the System Integration Testing, User Acceptance Testing and BAU phases.
  • Served as Subject Matter Expert (SME) on assigned projects, production support areas and knowledge transfer transition across different cross functional teams.
  • Experience in creating and managing new and existing Change Records (CRs) through approval and implementation, and in facilitating them across the QA, UAT, and PROD environments using Enterprise Content Management (ECM) and IT Service Management (ITSM) tools.
  • Experience in working relationships with cross functional teams including Release Management, Demand Management, Change Management, Governance & Data Architecture Management, Build & Source Data Management, Production Support teams.
  • Experience with investigating and solving production issues. Partnered with the Production Support team in troubleshooting various production/performance issues under tight deadlines and SLAs.
  • Experience in managing geographically distributed & diverse work groups at onsite & offshore locations.
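
A minimal set-based sketch of the change-data-capture logic such mappings implement: compare the staged extract to the current target and classify each key (table and column names are illustrative; NULL-safe comparisons are omitted for brevity):

  SELECT COALESCE(s.acct_id, t.acct_id) AS acct_id,
         CASE WHEN t.acct_id IS NULL THEN 'INSERT'
              WHEN s.acct_id IS NULL THEN 'DELETE'
              WHEN s.balance_amt <> t.balance_amt
                OR s.status_cd   <> t.status_cd THEN 'UPDATE'
              ELSE 'NO CHANGE' END       AS cdc_action
  FROM   stg_account s
  FULL OUTER JOIN tgt_account t
         ON t.acct_id = s.acct_id;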

Environment: Informatica Power Center Client 9.1.0 (Repository Manager, Mapping Designer, Workflow Manager, Workflow Monitor), Informatica Developer 9.1, Erwin Data Modeler 7.3.10, Abinitio GDE, Informatica Data Analyst, DB2, Teradata Client 13.0, UNIX, Putty, MS Visio 2010, Abinitio Metadata Portal/Hub 3.0, HP Quality Center 10, Control-M Scheduler, TOAD 10, WinScp, Tableau, Cognos 10, MS Project, Windows XP Professional, MS Office 2010 (Word, Excel, Power Point), Outlook, MS Office Live Meeting, JIRA, CITRIX, Tortoise SVN, VPN, EURC, ECM, and ITSM systems.

Confidential, Columbus, OH

Data Quality Consultant

Responsibilities:

  • Assisted the team in setting up the Data Governance team, the Software Development Life Cycle, project infrastructure design, and best practices.
  • Worked effectively with business, technical stakeholders, and data stewards.
  • Worked as part of the Data Quality team to establish a data quality methodology: documenting a reusable set of processes; determining, investigating, and resolving data quality issues; establishing an ongoing process for maintaining quality data; and defining data quality audit procedures.
  • Collaborated directly with the different LOB business data owners to establish the quality business rules that provide the foundation of the organization's data quality improvement plan.
  • Assisted the business analysts/project manager/data stewards in defining or modifying the project Data Quality Rules based on business requirements.
  • Worked on end-to-end project implementations and created presentations, framework/navigation/user help/requirement documents, and run books.
  • Worked on data profiling and data quality tools, and various data sources, to determine root causes and ensure correction of data quality issues due to technical or business processes.
  • Worked on data mapping and data modeling approaches based on logical data modeling.
  • Produced different measures and reports (in summary and in detail) for management on the progress of data quality improvement.
  • Participated in the enforcement of data quality standards, ensuring the quality, completeness and accuracy of data definitions.
  • Working experience with Business Data Management, Informatica Data Quality tools and Abinitio Metadata Management tools.
  • Designed and developed simple to complex Informatica mappings for data loads, using different transformations like Aggregator, Lookup, Expression, Update Strategy, Joiner, Router, etc., to load the extracted data into staging tables and into target Teradata tables, flat files, and the profile warehouse.
  • Imported metadata from sources such as data modeling tools, databases, ETL tools, Business Intelligence tools, Excel spreadsheets, and data dictionaries, and validated the imported metadata against the source content.
  • Implemented and governed the best practices relating to enterprise metadata management standards.
  • Monitored the Abinitio metadata web portals for accuracy and consistency across various environments.
  • Worked in different environments (Dev, UAT, QA) and moved the fully tested data to the PROD server environment.
  • Collected and linked metadata from diverse sources, including relational databases and flat files.
  • Used SQL tools like TOAD, SQL Developer, and Teradata SQL Assistant to run queries and validate the data (a typical reconciliation query follows this list).
  • Developed customized load processes and ran them automatically using UNIX shell scripts.
  • Scheduled load processes and automation of Informatica batch jobs using Control-M scheduler.
  • Conducted walkthrough reviews and profiling reviews and guided users in understanding and using the Informatica Data Quality and Abinitio metadata tools, while ensuring best practices/standards were followed.
  • Partnered with the Production Support team on various performance-related issues; helped troubleshoot production/performance issues under tight deadlines and SLAs.
  • Effectively worked and coordinated with resources at both onsite and offshore locations.
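
A typical post-load reconciliation query of the sort run through TOAD or Teradata SQL Assistant: compare row counts and a control total between the staged source and the loaded target (stg_txn and dw_txn are illustrative names):

  SELECT 'stg_txn' AS layer, COUNT(*) AS row_cnt, SUM(txn_amt) AS amt_total
  FROM   stg_txn
  UNION ALL
  SELECT 'dw_txn', COUNT(*), SUM(txn_amt)
  FROM   dw_txn
  WHERE  load_dt = CURRENT_DATE;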

Environment: Informatica Power Center 9.1.0/9.0.1/8.6.1 (Repository Manager, Mapping Designer, Workflow Manager, Metadata Manager), Informatica Developer 9.1, Informatica Data Analyst, Oracle 11g, Teradata client 13.0, Flat files, Cognos 9, XML, UNIX, Toad 10/9, Putty, MS Visio 2010, Windows XP Professional, MS Office 2007, CITRIX, VPN, JIRA, Abinitio Metadata Portal/Hub 3.0, SQL Developer, WinSCP, MS Project

Confidential, Columbus, OH

ETL Applications Developer

Responsibilities:

  • Plan, design, analyze, develop, code, test, debug and document ETL process to satisfy business requirements for large, complex projects.
  • Designed & documented complete functional and technical designs for ETL processes, application interfaces, and load, purge, and archive processes.
  • Designed ETL processes to optimize the flow of information across Chase retail lending and services, and other data warehouse entities within Chase in a way to meet their respective SLA's.
  • Created new designs, reviewed existing designs, identified impacts through impact analysis, and updated existing designs with new enhancements.
  • Installed, configured & upgraded Informatica Power Center 9.1.0/9.0.1/8.6.1 on IBM-AIX.
  • Worked on data mapping, data modeling and data profiling, data quality approaches.
  • Worked on multiple data integration techniques like ETL, ELT, Hybrid ETLT approaches.
  • Developed a number of complex Informatica mappings, mapplets, and reusable transformations for the Home Mortgage applications to facilitate daily, monthly, and yearly loading of data.
  • Worked in different environments (Dev, Test) and moved the fully tested data to the PROD server environment.
  • Collected and linked metadata from diverse sources, including relational databases and flat files.
  • Worked on various CDC approach models for different database source and target systems.
  • Extensively used Informatica client tools - Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Informatica Repository Manager and Informatica Workflow Manager.
  • Designed and developed simple to complex Informatica mappings for data loads, using different transformations like Aggregator, Lookup, Expression, Update Strategy, Joiner, Router, etc., to load the extracted data into staging tables and into target tables/flat files.
  • Involved in performance tuning by optimizing the sources, targets, mappings and sessions for better performance in Informatica.
  • Used Workflow Manager for creating, validating, testing, and running sequential and concurrent sessions and scheduling them to run at specified times.
  • Monitored transformation processes using Informatica Workflow monitor.
  • Used SQL tools like TOAD to run SQL queries and validate the data in the ODS.
  • Automated the ETL process using UNIX scripting and scheduled pmcmd to interact with the Informatica server from command-line mode.
  • Involved in writing/developing UNIX Shell scripts and utilities to run Informatica workflows.
  • Developed customized load processes and ran them automatically using UNIX shell scripts.
  • Worked on performance and tuning of database servers, and UNIX scripts.
  • Scheduled load processes at UNIX level, using Cron.
  • Worked on creating labels, assigning supporting mappings and objects to the labels, assigning dynamic deployment groups, and migrating these objects into test and prod environments.
  • Conducted code walkthroughs and code reviews, and made sure quality solutions were built.
  • Identified bottlenecks and tuned Power Center components during development and testing.
  • Worked on different error-handling strategies at the data and process levels (a simple reject-routing sketch follows this list).
  • Worked on creating Run Books for the processes and creating various QA docs as required.
  • Partnered with the Production Support team on various performance-related issues; helped troubleshoot production/performance issues under tight deadlines and SLAs.
  • Formulated Test plans, Test Scenarios, setting up of testing environments in conjunction with the Testing team and worked on unit and integration testing.
  • Investigated and fixed the root cause of the problems that are encountered during Production Support on priority basis.
  • Worked on and facilitated WRM, ECM, Risk Analysis, and Release Control Systems for various projects.
  • Coordinated with development resources at both onsite and offshore locations.
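
A simple data-level error-handling sketch: rows failing a business rule are routed to a reject table with a reason code instead of failing the load (the stg_loan/loan_stg_reject tables and the rule itself are placeholders):

  INSERT INTO loan_stg_reject
         (loan_id, reject_reason, load_dt)
  SELECT s.loan_id,
         'ORIG_AMT_NOT_POSITIVE',   -- reason code for downstream triage
         CURRENT_DATE
  FROM   stg_loan s
  WHERE  s.orig_amt IS NULL
     OR  s.orig_amt <= 0;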

Environment: Informatica Power Center 9.1.0/9.0.1/8.6.1 (Repository Manager, Source Analyzer, Mapping Designer, Mapplets, Transformations, Workflow Manager, Worklets, Workflow Monitor), Informatica Data Transformation Studio 9.1/Work Bench, Informatica Power Exchange 8.6.1, Oracle 11g/10g, Teradata13/12, MQ/BTQ series, PL/SQL, Flat files, Metadata Manager, XML, UNIX, Toad 10/9, Putty, MS Visio 2007, Windows XP Professional, MS Office 2007, CITRIX, VPN

Confidential, MI

ETL Developer/Analyst

Responsibilities:

  • Created business workflow diagrams, Use Case Diagrams, Activity Diagrams/State Chart Diagrams, Sequence Diagrams, and Collaboration Diagrams following UML methods, using MS Visio.
  • Provided expertise in strategic planning, business process modeling, business process analysis, object-oriented analysis & design, use case modeling, use case analysis, and quality assurance.
  • Involved in dimensional modeling: designed and developed Star schemas using Erwin, identifying fact and dimension tables.
  • Analyzed existing transactional database schemas and designed micro and macro level design models.
  • Created and maintained the Requirement Traceability Matrix between the requirements and other products such as design documents and test plans.
  • Tuned Teradata SQL queries and accomplished the data migration process that loads data from databases and files into Teradata through shell scripts, using Teradata SQL and utilities such as BTEQ, FASTLOAD, FASTEXPORT, and MULTILOAD (a typical tuning sequence follows this list).
  • Developed Complex Mappings using Transformations (Filter, Router, Lookup, Update Strategy, and Expression) on the extracted data according to the Business Rules and Technical Specifications.
  • Created transformations like Sequence Generator, Lookup, Joiner, and Update Strategy in Informatica Designer.
  • Developed Informatica Mappings and Mapplets to load data using various Power Center transformations.
  • Created transformations, started concurrent batch processes on the server, and performed backup, recovery, and tuning of sessions.
  • Worked on SQL tools like TOAD, SQL*Loader to run and load SQL queries and validate the data.
  • Developed UNIX/Linux shell scripts used in post-session commands in Informatica workflows, as well as for scheduling the workflows, defining the parameter files, and managing test cases across development and testing environments.
  • Involved in performing installations, development & maintenance of Informatica services in Unix/Linux environments.
  • Developed Test Scenarios and Test cases in HP Quality Center and involved in unit testing of mappings, system testing and user acceptance testing.
  • Created Autosys jobs by analyzing the entire flow of the process by giving proper dependency conditions.
  • Established and monitored issue management, change management, and quality management processes.
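
A typical Teradata tuning sequence for a long-running query: refresh optimizer statistics on the join/filter columns, then re-check the plan with EXPLAIN (the claim_fact table and its columns are illustrative):

  COLLECT STATISTICS ON claim_fact COLUMN (claim_dt);
  COLLECT STATISTICS ON claim_fact COLUMN (member_id);

  EXPLAIN
  SELECT member_id, SUM(paid_amt)
  FROM   claim_fact
  WHERE  claim_dt BETWEEN DATE '2009-01-01' AND DATE '2009-12-31'
  GROUP  BY member_id;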

Environment: Erwin, Informatica Power Center 8.6/8.1, Oracle 10g/9i, SQL 2008, TOAD, Teradata utilities - SQL Assistant, Cognos 7.1, HP Quality Center, Microsoft Visio 2007, MS Office 2003
