Java and AWS Developer Resume
Reston, VA
SUMMARY:
- Experience in developing HTML and JavaScript for client-side presentation and data validation within forms.
- Experience in utilizing open-source frameworks such as AngularJS, Spring, Hibernate, JSF and jQuery.
- Hands-on experience developing SOAP and REST web services in Java using Jersey.
- Experience working with Responsive Web Design (RWD) using Bootstrap.
- Experience working with Maven and Gradle.
- Extensive experience in using various design patterns such as MVC (Model-View-Controller) and Singleton.
- Experience in installing, deploying and testing with multiple application servers such as WebSphere 8.5/6.1/5.1/4.1, WebLogic 8.1/7.0 and the Tomcat web container.
- Good knowledge in developing applications with Java technologies and frameworks such as XML, JavaScript, JEE, JSP, EJB, JDBC, Servlets, JPA, JAX-WS (Java API for XML Web Services), Spring and Hibernate.
TECHNICAL SKILLS:
Languages: Java (J2SE 1.8/1.6/1.5, J2EE 6/5), SQL, PL/SQL, UML 2.0, Shell Scripting.
Technologies: JSP 2.1/2.0/1.2, Servlets 2.x, JavaBeans, JDBC, EJB 3.0/2.1, Hibernate 3.x/2.x, Spring 4, SOA, JMS 1.1, AJAX, JAXB 2.1/2.0, JAXP 1.x, RESTful and SOAP web services.
Web Technologies: HTML/DHTML, XHTML 1.1, JavaScript 1.x, XML 1.0, XSL, XSLT, CSS, Bootstrap, jQuery, AngularJS.
Development Tools (IDEs): Eclipse 3.2/3.0/2.1.1, Spring Tool Suite (STS), MyEclipse 6.0/5.1.1, NetBeans 3.3, MS Visual Studio 2005, Atom, Brackets, PuTTY, WinSCP.
Web/Application Servers: Tomcat 7.x/6.x/5.x, WebLogic 10.3/9.2/8.1/7.0, IBM WebSphere 8.x/7.x/6.x.
Design Patterns: Microservice Architecture, MVC, Front Controller, Session Facade, Singleton, Business Delegate, DAO and Page Object Model design patterns.
Databases: Oracle 11g/10g, MS SQL Server 2005/2000, MySQL 5.1/4.1, Couchbase, MongoDB.
Platforms: Windows, UNIX, Linux.
Methodologies: Agile Methodology.
Build & SCM Tools: Git, Maven, Gradle.
PROFESSIONAL EXPERIENCE:
Confidential, Reston, VA
Java and AWS Developer
Responsibilities:
- Designed and developed the application following Agile methodology, using issue tracking systems and participating in the daily stand-up meetings.
- Used Java to connect to and extract data from both databases as well as from the Salesforce instances, matching records on unique identifiers called Federation Identifiers.
- Extracted internal user data (user attributes, functional roles, profile) from LDAP and compared it with the internal users in the Salesforce instance.
- Similarly, extracted external user data from an Oracle database called RAM and compared it with the external users in the Salesforce instance.
- Built the Salesforce Connector (SFC) so that if a user is present in RAM/LDAP but not in Salesforce, SFC creates a new user; if a user's attributes change, it updates the Salesforce record; and if a user is no longer present in RAM/LDAP, it deactivates the Salesforce user (a simplified sketch of this sync logic appears after this list).
- Used Apache Ant to build the Java application, which was hosted on UNIX servers.
- Used the IBM Rational ClearQuest build engine, iCART, to build the JAR file and deploy it to the UAT and production UNIX servers.
- Used SVN as the version control system; the iCART engine pulls the latest files from SVN, builds a new JAR and deploys it to the UNIX servers.
- Updated the application configuration files used for database connections and for mapping users from RAM/LDAP to Salesforce.
- Updated the functional roles in the configuration files; these functional roles define the roles, profile, feature entitlements, permission sets and licenses that users in a particular RAM/LDAP group receive.
- Worked with the application teams to support their releases, deploying updated functional roles and connection configuration files to the UAT and production servers.
- Deployed new configuration files and took an active part in application shakeout whenever a new JAR or XML file was deployed to the UAT and production servers.
- Created shell scripts that run against Windows network drives, fetch files from SVN and upload them to the UNIX application paths.
- Created a self-service portal for the application teams so they can work in UAT directly by running the applications themselves.
- Added the ability for the app teams to execute jobs in preview and run modes within this self-service portal.
- Maintained the UNIX servers, including archiving, cleaning and deleting data on the production, UAT and backup servers.
- Migrated version control from SVN to Bitbucket.
- Created Jenkins jobs to run the test automation suite.
- Configured the test automation suite jobs in Sauce Labs, which use Selenium's Page Object Model.
- Took an active part in requirement and feasibility analysis of the backlog items in every sprint.
- Used JIRA for issue tracking and Xray for testing the SFC application.
- Wrote the Gherkin feature files for Cucumber test cases.
- Created Jenkins jobs to run the self-service jobs: they extract data/files from the application teams' Bitbucket repositories, build the JAR via the iCART engine and push the artifacts to the UNIX servers' run path.
- Migrated the application to the AWS environment using Lambda functions, EC2 instances, CloudFormation templates and S3.
- Created the UI for the SFC self-service portal, which gives app teams basic guidelines on how to use self-service and lets them update the functional roles themselves.
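The following is a minimal, illustrative sketch of the create/update/deactivate decision described above. The User and SalesforceGateway types are hypothetical stand-ins, not the actual connector code or the Salesforce API.

import java.util.Map;
import java.util.Objects;

/**
 * Sketch of the SFC sync decision: create users found only in RAM/LDAP,
 * update changed users, deactivate users missing from RAM/LDAP.
 */
public class UserSyncSketch {

    /** Hypothetical view of a user record keyed by Federation Identifier. */
    public record User(String federationId, String profile, String functionalRole, boolean active) {}

    /** Hypothetical stand-in for whatever component performs the Salesforce calls. */
    public interface SalesforceGateway {
        void createUser(User source);
        void updateUser(User source);
        void deactivateUser(String federationId);
    }

    public static void sync(Map<String, User> sourceUsers,      // from RAM/LDAP
                            Map<String, User> salesforceUsers,  // from the Salesforce instance
                            SalesforceGateway gateway) {
        // Present in RAM/LDAP: create if missing in Salesforce, update if attributes differ.
        for (User source : sourceUsers.values()) {
            User existing = salesforceUsers.get(source.federationId());
            if (existing == null) {
                gateway.createUser(source);
            } else if (!Objects.equals(existing.profile(), source.profile())
                    || !Objects.equals(existing.functionalRole(), source.functionalRole())) {
                gateway.updateUser(source);
            }
        }
        // Present in Salesforce but no longer in RAM/LDAP: deactivate.
        for (User existing : salesforceUsers.values()) {
            if (existing.active() && !sourceUsers.containsKey(existing.federationId())) {
                gateway.deactivateUser(existing.federationId());
            }
        }
    }
}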
Required Skills: Programming languages such as Java; RDBMS - Oracle; Shell scripting; UNIX scripting.
Tools: Amazon Web Services (Lambda, EC2 instances, CloudFormation templates, S3), Oracle SQL Developer 4.0, Microsoft Excel 2007, Oracle 11g R2, Eclipse, Sublime Text 2.0, WinSCP, PuTTY, TortoiseSVN, Beyond Compare, Selenium, IBM Rational ClearQuest, CA Workload Automation tools.
Confidential, Reston, VA
Java and AWS Developer
Responsibilities:
- Designed and developed the application following Agile methodology, using issue tracking systems and participating in the daily stand-up meetings.
- Created a prototype version of the project to test the ability of the chosen software to send data to Salesforce Marketing Cloud.
- Used Java to create the AWS Lambda JAR files that connect to and extract data from the enterprise database.
- Stored the extracted data in RDS as a staging area and used AWS Lambda functions to FTP the data into Salesforce Marketing Cloud (SFMC).
- Because of organizational limits on Lambda execution time, the FTP transfers were timing out, which led us to AWS Glue and EC2 instances.
- Because of the large volume of data in the enterprise database, we used Apache Sqoop to extract the data into an S3 bucket as CSV files.
- Used Apache Sqoop and the Apache Spark big data framework installed on an Elastic MapReduce (EMR) cluster in AWS.
- Responsible for creating the EMR cluster with Sqoop installed and for adding the OJDBC driver to connect to the Oracle enterprise database.
- Responsible for extracting relational data from the Oracle database using Apache Sqoop commands, including applying filters to the extraction via Sqoop syntax and creating the AWS EMR cluster.
- Connected to the master node of the EMR cluster to Sqoop the data into the S3 bucket as the source data for the AWS Glue ETL job.
- Responsible for transforming the data into the format required by Salesforce Marketing Cloud, using AWS Glue to transform the data and stage it in Amazon S3.
- Used a Glue crawler to build the database catalog in AWS Glue, which is used to create the database schema.
- Used Python scripting and Spark to create the DataFrames on which the SQL logic can be applied.
- Wrote Spark SQL queries against these DataFrames to produce the data set that is sent to SFMC via FTP (an illustrative DataFrame/SQL sketch appears at the end of this section).
- The AWS Glue deliverable is staged in the S3 bucket and triggers an AWS Lambda function that encrypts the file and sends it to the Marketing Cloud over FTP.
- Used a CloudFormation template to automate the whole process, from creating the EMR cluster to sending the deliverables to Salesforce Marketing Cloud.
- Created EC2 instances to encrypt the data extracted into the S3 bucket.
- Created EC2 instances to compress the data and encrypt it using PGP before sending it to Salesforce Marketing Cloud.
- Used FTP to send the data to the business units used by Salesforce Marketing Cloud.
- Created Java classes to encrypt and compress the data files in the S3 bucket and hosted them in Lambda functions or EC2 instances (a brief sketch follows this list).
- Used Glue transforms and Lambda functions for data cleansing, such as merging and unmerging records to avoid duplicates in the database.
- Used the CloudFormation template to create the EMR cluster and automate the Sqoop extraction and Glue transformations so the jobs run on a daily basis.
- Created two types of ETL jobs in AWS Glue: one to extract the entire data set from the database and another to extract only incremental data.
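As a rough illustration of the Java compression step mentioned above, the sketch below shows an S3-triggered Lambda that GZIP-compresses a staged Glue deliverable. The bucket/prefix names are hypothetical, error handling is minimal, and the PGP encryption and FTP hand-off are separate steps omitted here. It assumes the AWS SDK for Java (v1) and the aws-lambda-java-events library.

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.util.zip.GZIPOutputStream;

/**
 * Sketch of an S3-triggered Lambda that compresses a staged Glue deliverable
 * before the (separate) PGP-encryption and FTP hand-off to SFMC.
 */
public class CompressForSfmcHandler implements RequestHandler<S3Event, String> {

    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

    @Override
    public String handleRequest(S3Event event, Context context) {
        // The Glue job drops its deliverable into the bucket/key that fired this event.
        String bucket = event.getRecords().get(0).getS3().getBucket().getName();
        String key = event.getRecords().get(0).getS3().getObject().getKey();

        try (InputStream in = s3.getObject(bucket, key).getObjectContent()) {
            // Compress into Lambda's /tmp scratch space before re-uploading.
            File gz = new File("/tmp/" + new File(key).getName() + ".gz");
            try (GZIPOutputStream out = new GZIPOutputStream(new FileOutputStream(gz))) {
                in.transferTo(out);
            }
            // Stage the compressed file under a prefix watched by the encryption/FTP step (hypothetical).
            String outKey = "compressed/" + new File(key).getName() + ".gz";
            s3.putObject(bucket, outKey, gz);
            return outKey;
        } catch (Exception e) {
            throw new RuntimeException("Compression failed for s3://" + bucket + "/" + key, e);
        }
    }
}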
Required Skills: Programming languages such as Java, Python and Scala; RDBMS - Oracle; Apache Spark; Apache Sqoop; PySpark scripting; SQL.
Tools: Amazon Web Services (Glue, Lambda, EMR, EC2 instances, CloudFormation templates, Athena, S3), Oracle SQL Developer 4.0, Microsoft Excel 2007, Oracle 11g R2, Eclipse, Sublime Text 2.0.
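For illustration of the DataFrame-plus-SQL step referenced above: the production Glue jobs were scripted in Python/PySpark, but an equivalent sketch using Spark's Java API might look like the following. The S3 paths, view name and column names are hypothetical.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

/**
 * Sketch of the DataFrame + SQL transformation: read the Sqoop CSV output,
 * shape the deliverable with Spark SQL, and stage it back in S3.
 */
public class SfmcExtractSketch {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("sfmc-extract-sketch")
                .getOrCreate();

        // Source: CSV files Sqooped from the Oracle enterprise database into S3.
        Dataset<Row> source = spark.read()
                .option("header", "true")
                .csv("s3://example-staging-bucket/sqoop-output/");

        // Register a temp view so the transformation can be expressed in SQL.
        source.createOrReplaceTempView("customer_contacts");

        // Shape the data set that will be compressed/encrypted and FTP'd to SFMC.
        Dataset<Row> deliverable = spark.sql(
                "SELECT customer_id, email, first_name, last_name "
              + "FROM customer_contacts "
              + "WHERE email IS NOT NULL");

        // Stage the deliverable back in S3 for the downstream Lambda step.
        deliverable.write()
                .mode("overwrite")
                .option("header", "true")
                .csv("s3://example-staging-bucket/sfmc-deliverable/");

        spark.stop();
    }
}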