
Senior Systems Engineer Resume


Actively looking for a new opportunity in Big Data Development, ETL Design, Data Warehousing, or Business Analysis.


ETL Tools: Informatica PowerCenter 9.6.x/9.5.x, Informatica Big Data Management, Informatica Cloud, Salesforce

Databases and Utilities: Oracle 11g/10g, Teradata, IBM DB2, Apex Data Loader

Financial Software: Charles River IMS

Reporting: Tableau


Scripting Languages: SQL, PL/SQL, UNIX Shell Script



Senior Systems Engineer

Skills: Charles River IMS 9.8, Oracle Database, Tableau, UNIX Shell Scripting / Platform: Informatica PowerCenter


  • Experienced in loading financial assets to CRD via ETL batch using Informatica PowerCenter 9.6.x and Oracle Database.
  • Implemented complex financial data mapping designs to create new assets or update existing assets.
  • Maintained mapping transparency where possible to handle pricing-file and cash-flow systems (e.g., ASM and Bloomberg).
  • Emphasized partition design and reusability of PowerCenter mappings to improve performance and run times.
  • Worked on multiple data warehouse design implementations per client requirements.
  • Implemented approaches for current-state mapping logic for security attributes and security types.
  • Designed PowerCenter mappings using Informatica transformations (active and passive) and PL/SQL procedures using cursors, functions, and triggers.
  • Expertise in handling and furnishing financial data for Fixed Income, Bloomberg, Morningstar, security lists, custodian accounts, and swaps.
  • Specialized in requirement analysis, gap analysis, incident management, problem management, and troubleshooting methodologies, with in-depth knowledge of all stages of the software development life cycle (SDLC).
  • Created migration shell scripts for file moves and file transfers, as well as Perl scripts for file watching.
  • Worked closely with Architect on design issues and suggested solutions for better performance.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
  • Designed approaches for dimensional data modeling, Slowly Changing Dimensions Types I, II, and III, ODS, Star/Snowflake schema modeling, fact and dimension tables, and OLAP and OLTP systems.
  • Hands-on experience in code migration using Perforce and Atlassian products (Bamboo, Bitbucket), and in Autosys job creation for nightly and real-time batches.
  • Used dofeeds.pl to implement CRD extensions for event-on-load and event-on-finish.
  • Created batch reports in Tableau using tabcmd and dashboards connecting to an Oracle database.
  • Involved in build, implementation, testing, and maintenance, and guided the team to meet quality and schedule targets.
  • Worked closely with team members in Agile methodologies and onsite/offshore implementations.
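The migration shell scripts for file moves described above can be sketched roughly as follows; the directory names and the feed-file pattern are hypothetical stand-ins, not taken from the actual environment:

```shell
#!/bin/sh
# Sketch of a file-move/archive wrapper for ETL feed files.
# SRC_DIR, ARCHIVE_DIR, and the .dat pattern are illustrative assumptions.
SRC_DIR="${SRC_DIR:-/tmp/etl_inbox}"
ARCHIVE_DIR="${ARCHIVE_DIR:-/tmp/etl_archive}"

mkdir -p "$SRC_DIR" "$ARCHIVE_DIR"

# Stage a sample feed file so the sketch has something to move.
printf 'CUSIP|PRICE\n12345|101.5\n' > "$SRC_DIR/asset_feed.dat"

# Move every completed feed file into the archive, stamping it with a
# load date so each day's file is kept under a distinct name.
for f in "$SRC_DIR"/*.dat; do
  [ -e "$f" ] || continue
  base=$(basename "$f" .dat)
  mv "$f" "$ARCHIVE_DIR/${base}_$(date +%Y%m%d).dat"
done

ls "$ARCHIVE_DIR"
```

In practice a script like this would typically be followed by an FTP/MFT step; here only the local move is shown.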


ETL Developer, Atlanta, GA

Skills: Hadoop cluster, IntelliJ, PyCharm / Platform: Informatica PowerCenter, Informatica Big Data Management


  • Worked on Informatica Big Data Management to load data to Hadoop HDFS, Hive, and HBase; configured Big Data Management for column profiling; and applied the Informatica Spark engine to improve performance.
  • Experienced with dimensional modeling, data migration, data masking, data cleansing, data profiling, and ETL processes for data warehouses.
  • Monitored Informatica Big Data workflows with YARN and troubleshot Scala code in the Spark execution plan.
  • Experienced in initial data imports and incremental data loads to HDFS and Hive; also executed Sqoop merges for large volumes of HDFS data.
  • Worked on PowerCenter Mapplets/Worklets, delay timers, and scheduler jobs for processing hub-related message queues in real time.
  • Designed workflows to receive messages on start, generate tokens for each run and log exceptions using PowerCenter workflows.
  • Maintained real-time customer data profile updates via a JMS message listener, paging messages to PowerCenter inbound.
  • Developed Data Synchronization Tasks, Data Replication Tasks, Mappings and Task Flows in Informatica Cloud.
  • Generated Service Connectors, Connections, Processes and Process Object in Informatica Cloud Real Time.
  • Hands on experience in tuning SQL queries for enhancing the database performance.
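The incremental import pattern above can be sketched as a small checkpoint wrapper; since no Hadoop cluster is assumed here, the sqoop command is only composed and echoed, and the checkpoint path, table, and column names are illustrative:

```shell
#!/bin/sh
# Sketch of incremental-load bookkeeping: a checkpoint file tracks the last
# value of an incrementing key between runs. All names here are assumptions.
CKPT="${CKPT:-/tmp/customer_last_id}"
[ -f "$CKPT" ] || echo 0 > "$CKPT"
LAST_ID=$(cat "$CKPT")

# Compose (but do not run) the Sqoop incremental-append import.
CMD="sqoop import --table CUSTOMER --incremental append \
  --check-column CUST_ID --last-value $LAST_ID"
echo "$CMD"

# After a successful load the checkpoint would be advanced to the new
# high-water mark; 42 is a stand-in for the max key returned by the load.
NEW_MAX=42
echo "$NEW_MAX" > "$CKPT"
```

The same checkpoint idea applies whether the target is HDFS or a Hive table; only the import command changes.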


ETL Developer, Buffalo, NY

Skills: Teradata / Platform: Informatica PowerCenter, Salesforce CRM


  • Participated in the Design Workshops with the different stakeholders of the application and came up with the finalized approach.
  • Created JCLs on mainframes to run BTEQ scripts, and scripted BTEQ using derived tables and global/volatile tables for processing legacy data.
  • Worked on Teradata stored procedures, including sparse join indexes and Teradata macros, and granted execute privileges to users.
  • Used Salesforce Cloud to create fields and data models in Salesforce to interact with Informatica ETL.
  • Extensively worked on Salesforce CRM mappings to generate delta and initial loads to Salesforce targets, and monitored upload jobs.
  • Used different types of source imports in PowerCenter (flat file, COBOL, and Excel).
  • Worked on the Java transformation in Informatica ETL to send messages to hub services; the Java code interacted with Informatica to monitor transactions and send messages via JasperSoft reports.
  • Created custom Stored Procedure transformations in mappings and worked with Informatica MDM analytics to run the workflows.
  • Experienced in using Perforce as version control for code migration.
  • Used the Apex Data Loader to verify data loads and created sandboxes for pre-test runs from Salesforce.
  • Extensively used UNIX shell scripts for FTP/MFT and for updating parameter files with source, target, and lookup connections.
  • Authored technical design documents based on the business requirements.
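A BTEQ script of the kind described above, using a volatile table to stage legacy data, might look like the following sketch; the logon line, database, and table names are placeholders, and the script is only written to disk rather than submitted, since no Teradata system is assumed:

```shell
#!/bin/sh
# Write out a sample BTEQ script. Everything inside the heredoc is a
# hypothetical illustration of the volatile-table pattern described above.
cat > /tmp/legacy_load.bteq <<'EOF'
.LOGON tdprod/etl_user,********;

/* Volatile table holds one session's worth of legacy staging rows. */
CREATE VOLATILE TABLE vt_legacy_cust AS (
  SELECT cust_id, cust_name
  FROM   legacy.customer_stg
) WITH DATA ON COMMIT PRESERVE ROWS;

/* Load the dimension from the session-scoped volatile table. */
INSERT INTO dw.customer_dim
SELECT v.cust_id, v.cust_name
FROM   vt_legacy_cust v;

.LOGOFF;
.QUIT;
EOF

echo "wrote $(wc -l < /tmp/legacy_load.bteq) lines"
```

On the mainframe side, a JCL step would invoke BTEQ with this script as SYSIN; here the script is only generated.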


Informatica ETL Developer

Skills: Informatica PowerCenter 8.6.1, Oracle 10g, Oracle SQL Developer, MLOAD, FLOAD, SQL/SQL*Plus, UNIX (Sun Solaris).


  • Worked with source databases like Oracle, SQL Server and Flat Files.
  • Worked on day-to-day administration activities such as user creation, folder creation, and configuring DB connections.
  • Extensively worked with various Active transformations like Filter, Sorter, Aggregator, Router and Joiner transformations.
  • Extensively worked with various Passive transformations like Expression, Lookup (connected and unconnected), Sequence Generator, Mapplet Input and Mapplet Output transformations.
  • Developed Mapplets to implement business rules using complex logic.
  • Used Informatica Repository Manager to create Repositories, User Groups and Users based on their roles.
  • Converted the PL/SQL Procedures and SQL*Loader scripts to Informatica mappings.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size and target based commit interval.
  • Developed UNIX shell scripts to automate the data transfer (FTP) process to and from the Source systems, to schedule weekly and monthly loads/jobs.
  • Used Informatica Designer to create complex mappings using different transformations to move data to multiple databases.
  • Designed and developed pre-session, post-session, and batch execution routines to run Informatica sessions using Informatica Server Manager.
  • Used the Debugger to check for errors in mappings.
  • Worked on Informatica Slowly Changing Dimensions (SCD) Types I, II, and III with connected Lookup (LKP) transformations.
  • Generated UNIX shell scripts for automating daily load processes.
  • Managed change control implementation and coordinated daily and monthly releases and reruns.
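A scheduling wrapper of the kind described above (shell scripts driving Informatica sessions for daily loads) might look like the following sketch; the lock file prevents overlapping runs, and the pmcmd call is only echoed because no Informatica server is assumed (service, domain, folder, and workflow names are made up):

```shell
#!/bin/sh
# Sketch of a daily-load wrapper. A lock file guards against a second run
# starting while the previous one is still active.
LOCK="/tmp/daily_load.lock"
if [ -e "$LOCK" ]; then
  echo "previous run still active, exiting" >&2
  exit 1
fi
: > "$LOCK"

# The real script would run pmcmd and block until the workflow finished;
# here the command is only echoed to a log. All names are hypothetical.
echo "pmcmd startworkflow -sv INT_SVC -d DOMAIN -f DAILY -wait wf_daily_load" \
  > /tmp/daily_load.log

rm -f "$LOCK"
echo "daily load dispatched"
```

Scheduled from cron, a wrapper like this covers both the weekly/monthly FTP loads and the nightly session runs mentioned above.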
