Some Ideas to Secure the Job You Want

Posted By: Usha Rama | Nov-15, 2017 | Activities, Informative, Engineering, Tech, Technology Upgrades
http://image.usharama.edu.in/blog/secure-the-job/job.jpg


The job hunt is everyone's first target once they finish their studies or college. But few people stop to consider what kind of job they actually want to join, or what the job market looks like today.

While the job market may have improved, there is still tough competition for the top roles. It's a horrible feeling to miss out on a job that you really wanted, so we've put together these tips to help you stand out as the best candidate and secure your ideal position.

Be clear on what you want

This might sound obvious, but it's surprising how many people look for a new role without really being clear about what an ideal position looks like. If you're unhappy in your current role, or unemployed, then any job probably seems like a better option, but it makes sense to think through what you're seeking in your next role and what factors are important to you in terms of job satisfaction.

What level of responsibility are you looking for? What salary do you need? What length of commute would you be happy with? What sort of company do you want to work for? Knowing what your deal breakers are, and which elements you'd be willing to compromise on, will help you identify what constitutes an ideal position for you, and that's a key first step in being able to secure it!

Be organized

When you're actively looking for a job, it can be hard to keep track of which jobs you've applied for, from which job boards and with which recruiters. This is why it's crucial to stay organized: there's nothing more off-putting for a recruiter or company than speaking with a candidate who sounds unsure of whether they've applied for a particular role or not. They naturally want to feel like the role they're calling about is the most important one on your radar.

Keep a spreadsheet to record your applications, including details such as job title, date applied, where you applied, the name of the recruiter/hiring manager/company with whatever contact information was supplied, and a column for the status of the application, e.g. awaiting feedback, rejected, first interview. This will also help you identify which opportunities to follow up on if you haven't received any feedback by a certain point.
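For example, a minimal tracker might look like this (the row below is purely illustrative):

Job Title | Date Applied | Source | Recruiter / Contact | Status
Java Developer | 10-Nov-2017 | Job board | A. Rao, hr@example.com | Awaiting feedback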

Develop good relationships

They say it's who you know, not what you know, and that's why it's important to develop relationships with people who can help you land your ideal job. These can include recruiters, people within companies you'd like to work for, people you've previously worked with, people in your industry, those you studied with and so on.

When you’re busy it can be easy to lose touch with people but social media can help you stay connected. First of all, look to reignite any existing relationships you have. Don’t come straight out and ask if they can help you secure a new job. You can send them a note saying it’s been a while and ask how they are and what they’re up to or comment on something they’ve posted online / share something with them that you think would be of interest. You can then look for other relevant people to connect with. Again, keep your initial outreach friendly, rather than going straight in with ‘can you help me get a new job?’ You should also look for opportunities to develop relationships offline, such as at industry events.

In terms of building good relationships with recruiters, the key is regular and honest communication. Let them know what you’re looking for, what your current situation is, what interviews you’re going on etc. Also, make sure you respond as soon as you can to any opportunities they present to you. If you’re going to be unavailable for a prolonged period of time then let them know this in advance so they don’t think you’re no longer interested in a new role.

Develop a strong online presence

If you're looking for a new job, you need to make sure you can be found online and that what you say portrays you in a positive way. Complete your profile information on all the social sites you're on so that you'll show up in recruiter searches. Also, do a Google search for yourself to see what's returned and whether there's anything that needs removing or privacy settings that need altering.

Refrain from badmouthing your current employer and from having inappropriate photos visible to people you're not connected with. Instead, look to share industry insights, blog about relevant issues, comment on existing blogs and so on. These things make you look like you take an active interest in your career and are knowledgeable about current news.

Tailor your Resume

Use the job description to make sure that your CV clearly shows you have the skills the role requires. It will also help you prioritize the order in which you present these skills and decide whether there are elements you can remove for this opportunity. You don't need a completely new CV for each role you apply to, but you should tweak your master copy so the recruiter can quickly see what a great fit you are for this position. This is especially important for CVs that go through an applicant tracking system (ATS), as these look for certain keywords, often ones mentioned in the job advert.
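As a rough illustration of the kind of matching involved (real systems vary, and the keyword list and CV text below are hypothetical), a keyword check might look like this:

```python
# A minimal sketch of ATS-style keyword matching; the "." and "+"
# in the character class keep tokens like "c++" or "node.js" intact.
import re

def extract_words(text):
    """Lowercase a document and split it into a set of word-like tokens."""
    return set(re.findall(r"[a-z+#.]+", text.lower()))

job_ad_keywords = {"java", "spring", "sql", "agile", "leadership"}
cv_text = "Led an agile team of five Java developers; designed SQL schemas."

cv_words = extract_words(cv_text)
matched = job_ad_keywords & cv_words
missing = job_ad_keywords - cv_words

print(f"Matched {len(matched)}/{len(job_ad_keywords)}: {sorted(matched)}")
print(f"Consider adding: {sorted(missing)}")
```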

You should also make sure that your CV contains quantified achievements rather than a list of responsibilities, e.g. "increased profitability by 24% over the fiscal year".

Do your research

Some people save research for the interview stage, but it's important before then too. Researching a company will help you determine whether it's one you'd like to work for, and if it is, you can use what you learn in your cover letter to help you stand out.

There's some debate about whether cover letters are still necessary, but why miss out on an additional opportunity to sell yourself? In this short note you can detail why you're excited about the role and why you think you'd be good at it. This is where you can draw on insights from your research.

Demonstrate why you’re the strongest candidate

If you’ve made it to the interview stage then now is your ultimate chance to shine. Through your research and the job description (plus your conversations with the recruiter if this is how you found the job), you should have an idea of why this vacancy has been created and what the company needs from this person. Use these insights to prepare examples of your experience which showcase that you can provide what they need.

There's nothing worse in an interview than being at a loss for an example of where you've demonstrated a particular required skill or behavior, so thorough preparation beforehand should help ensure this doesn't happen.

Conclusion

If you're working with a recruiter, make sure to provide feedback quickly on how the interview went and on any areas where you feel you may not have come across as well as you would have liked, so that they can pass this on to the employer and help overcome any doubts or questions about you.

However, if you found the job directly with the employer, be sure to send a concise thank-you email which reiterates your interest in the role. Where possible, reference something that was said in the interview, for example: "It sounds like an exciting time to join the company with the recent acquisition of Z, and I would relish the opportunity to help you achieve your growth plans."


What is Cloud Security and Why should we Consider it?

Posted By: Usha Rama | Nov-09, 2017 | Informative, Engineering, Tech
http://image.usharama.edu.in/blog/cloud-security/image.jpg


Introduction

A cloud refers to an IT environment designed for remote access to IT resources. The term originated as a metaphor for the Internet, which is a network of networks providing access to remote, decentralized IT resources. A cloud is accessible through the Internet, and there are many different clouds accessible this way.

Cloud computing provides several benefits for organizations and users:

  • Almost any type of computing resource can be provisioned on demand.
  • Organizations can scale the resources they use up and down as requirements change.
  • Users pay only for the resources and workloads they actually use.

Whichever cloud service provider you use, cloud security is essential for assessing the security of the operating systems and applications running in your cloud. Ensuring ongoing security in the cloud requires cloud instances equipped with defensive security controls that can withstand the latest data breach threats.

The following points can help secure a cloud-based deployment.

Understand your shared responsibility: while the cloud provider secures a large part of the virtualization and physical infrastructure, responsibility for the rest falls on the organization using it. Depending on the services used, it is the user's responsibility to enforce application security, policies, configuration and so on.

  1. Network Protection: Use defense in depth and secured services such as the following (a short provisioning sketch follows this list).

  • Virtual Private Networks (VPN)
  • Routing rules
  • Network ACLs
  • Proxy servers: Nginx
  • Stateful firewalls
  • Network Address Translation (NAT)
  • Layered tools: application (ModSecurity), host (iptables), network (pfSense)
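As a concrete illustration of one such control, here is a hedged sketch of restricting inbound access with an AWS security group via boto3; the group ID and CIDR are placeholders, and AWS is only an assumed example provider (other clouds have equivalents):

```python
# Requires boto3 installed and AWS credentials configured.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Allow SSH only from a known admin network instead of 0.0.0.0/0.
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",      # placeholder group ID
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 22,
        "ToPort": 22,
        "IpRanges": [{"CidrIp": "203.0.113.0/24",
                      "Description": "admin VPN only"}],
    }],
)
```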

2. Protection of the Cloud Machine Images

  • Harden machine images
  • Change default passwords
  • Disable insecure ports and services (see the port-check sketch after this list)
  • Install AV software
  • Use a baseline such as the STIGs (system-specific security checklists)
  • Learn the Security Content Automation Protocol (SCAP), which provides tools that help administrators and auditors enforce security baselines
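A minimal sketch (not a hardening tool) of verifying that insecure services really were disabled on a hardened image; the host address and port list are hypothetical:

```python
import socket

SHOULD_BE_CLOSED = {21: "ftp", 23: "telnet", 3389: "rdp"}

def check_ports(host, ports, timeout=1.0):
    """Return (port, name) pairs that unexpectedly accept connections."""
    open_ports = []
    for port, name in ports.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:   # 0 means the connect succeeded
                open_ports.append((port, name))
    return open_ports

for port, name in check_ports("10.0.0.12", SHOULD_BE_CLOSED):
    print(f"WARNING: insecure service {name} still listening on port {port}")
```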

3. Protection of Data at Rest: data at rest refers to inactive data stored digitally. To protect such data:

  • Understand the different cloud storage mechanisms and their security implications.
  • Review the available encryption primitives (a small encryption sketch follows this list).
  • Consider secure archival and data disposal.
  • Tools: LUKS, dm-crypt, GNU shred
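The tools above work at the block-device level; as a language-level illustration of encrypting data before it is written to disk, here is a minimal sketch using the third-party cryptography package (pip install cryptography), with key handling deliberately simplified:

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, store this in a key management system
fernet = Fernet(key)

record = b"sensitive customer record"
token = fernet.encrypt(record)     # ciphertext safe to write to disk

with open("record.enc", "wb") as f:
    f.write(token)

# Later, with access to the key, the plaintext is recoverable:
assert fernet.decrypt(token) == record
```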

4. Protection of Data in Transit: data in transit refers to data flowing through a public or a private network.

  • Always use secure application protocols such as TLS (Transport Layer Security), SSH (Secure Shell), and RDP (Remote Desktop Protocol); a short TLS sketch follows this list.
  • When an application does not support secure protocols, tunnel its traffic securely using IPsec, SSL VPN, or SSH.
  • Consider using a key management system.
  • Tools: OpenVPN, Openswan
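For instance, here is a minimal sketch of wrapping a client connection in TLS using Python's standard ssl module; the host name is a placeholder:

```python
import socket
import ssl

context = ssl.create_default_context()   # verifies server certificates by default

with socket.create_connection(("example.com", 443)) as raw_sock:
    with context.wrap_socket(raw_sock, server_hostname="example.com") as tls:
        print("negotiated:", tls.version())   # e.g. TLSv1.2 or TLSv1.3
        tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
        print(tls.recv(200))                  # first bytes of the response
```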

5. Protection and Patching of Instances

  • Use a configuration management system to patch all cloud-based instances.
  • Look for zero-days and classify risks.
  • Tools: OpenVAS

6. Protection of Instance Access

  • Manage access to cloud instances using a directory service.
  • Create individual user accounts (IAM).
  • Grant least privilege based on business needs (a short policy sketch follows this list).
  • Enable MFA (multi-factor authentication) for privileged users.
  • Audit all user activity.
  • Refrain from using root cloud accounts.
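As an illustration of least privilege, here is a hedged boto3 sketch attaching a read-only policy to an individual IAM user; the bucket and user names are hypothetical, and AWS is only an example provider:

```python
import json
import boto3

iam = boto3.client("iam")

# Policy allowing read-only access to a single bucket, nothing more.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": ["arn:aws:s3:::example-reports",
                     "arn:aws:s3:::example-reports/*"],
    }],
}

iam.put_user_policy(
    UserName="report-reader",          # an individual account, not root
    PolicyName="reports-read-only",
    PolicyDocument=json.dumps(policy),
)
```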

7. Application protection

  • Implement AAA (authentication, authorization, and auditing).
  • Understand the OWASP Top 10 security flaws (an injection example follows this list).
  • Follow secure development best practices.
  • Tools: Jenkins, PMD, FindBugs
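For example, the top flaw in the 2017 OWASP Top 10 is injection. A minimal sketch with Python's standard sqlite3 module shows how parameterized queries keep attacker input inert:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"    # a classic injection attempt

# Unsafe: f"SELECT role FROM users WHERE name = '{user_input}'"
# Safe: the driver binds the value, so the quote is treated as data.
rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)    # [] -- the injection attempt matches no user
```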

8. Auditing and monitoring the cloud

  • Gather monitoring data in a separate, secure network.
  • Establish baselines and monitor all layers and protocols.
  • Deploy an IDS (Intrusion Detection System) behind the network firewall.
  • Fine-tune alert levels and use redundant channels for alerting.
  • Tools: Nagios, ELK Stack, Watcher, Snort

9. Validate protection

  • Periodically test the network, applications, and infrastructure for security vulnerabilities.
  • Check for input validation flaws, session manipulation, authentication weaknesses, and information leakage.
  • Use third-party tools wherever possible.
  • Tools: Metasploit, Kali Linux, OpenVAS

10. Automation: automated provisioning helps with documentation, disaster recovery planning, and change management.

  • Use a configuration management system such as Chef or Puppet to manage configuration centrally.
  • Consider infrastructure as code.
  • Implement continuous integration and continuous delivery.
  • Tools: Docker, Ansible, and Chef

11. Update security policy

  • Define the scope and boundaries of security.
  • Implement a proper methodology for risk assessment, identification, and remediation.
  • Align policies with the contractual obligations of the cloud provider.
  • Use compliance management tools: OpenFISMA, PTA, SOMAP, GLPI.

Conclusion

There are some things that are easier and some that are harder in the cloud. The steps listed above will, however, get you started on a continuous improvement cycle for security. Before you implement a cloud application on grounds of time and cost, it is essential to understand the threats of data and security breaches.

Whenever an organization moves to or deploys a new application, the goal is to drive sales up, drive operational costs down, or both. By making well-informed choices, cloud computing can offer you business value, choice, and flexibility, which are the strongest reasons for implementing a new application in the cloud.


How Artificial Intelligence is Reshaping the Future of IT Recruitment

Posted By: Usha Rama | Oct-05, 2017 | Activities, Informative, Engineering, Tech, Technology Upgrades
http://image.usharama.edu.in/blog/artificial-intelligence/Ai-01.jpg

Artificial Intelligence reshaping the HR transformation process

The 2016 TechHR conference shed light on how advances in technology, especially robotics and Artificial Intelligence, are reshaping the HR transformation process tremendously. We are all aware that transactional HR tasks, the ones connected to the day-to-day mechanics of keeping an organization running, have already been siphoned off and automated using robots. Here is the entire story.

Traditional HR processes involve a variety of repetitive, administrative tasks. The advent of technology in the human resource segment has now brought a major breakthrough to the industry.

"There is something called the ease of doing business, and I sincerely believe this is the comfortable way of doing HR. We have unnecessarily made it complicated; it is too policy-driven," as per the speakers at Techworks at TechHR 2016 and many others today. Further, the challenge is how we enable it and what robotics can do to simplify it. "Things which are repetitive in nature follow similar steps; if we add a layer of automation, it can complete the assigned tasks. That is what we have been doing till now."

Recently there has been an explosion of data, predictive analytics, and statistics. Artificial intelligence is the next giant leap, and it is going to play a vital role in the HR industry. The role of HR is to understand and anticipate emerging trends in technology and prepare companies to embrace them. So, what does Artificial Intelligence mean for HR, and what impact can it have? That is what everyone is looking to find out.

Artificial Intelligence can be seen as a combination of three things: robotics, machines, and data. How can AI ease HR processes? Here is an example: the creation of an advanced macro. It can work at night, when few or no employees are in the office. It logs into the HRMS by itself and identifies for whom letter 1, letter 2, or more has to be scheduled.

It then issues the print command, and the letters come out on letterheads from dedicated printers, so the documents needed the very next morning are ready; the person in need can come and pick up those letters, put them in envelopes, and dispatch them. This can ease the process of providing employment proof and address proof on company letterheads to employees.

The underlying deterministic logic is also how it helps provide information to employees. For instance, if a female employee would like to know how much maternity leave she is eligible for, the system can tell her whether she is eligible and, if so, display the number of days on the screen.
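As a rough illustration of such deterministic logic, here is a minimal sketch; the tenure threshold and entitlement below are purely illustrative, not actual statutory rules:

```python
from dataclasses import dataclass

@dataclass
class Employee:
    months_of_service: int
    leave_already_taken_days: int

def maternity_leave_days(emp: Employee) -> int:
    """Return eligible maternity leave in days (illustrative rules only)."""
    if emp.months_of_service < 6:          # assumed minimum tenure
        return 0
    entitlement = 182                      # assumed base entitlement in days
    return max(entitlement - emp.leave_already_taken_days, 0)

print(maternity_leave_days(Employee(months_of_service=24,
                                    leave_already_taken_days=30)))  # 152
```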

This is how Artificial Intelligence not only simplifies processes but also helps in making logical decisions by removing biases. It can bring a paradigm shift to areas of HR operations such as learning, development, and recruiting. Another significant role of HR managers is to advise business leaders about upcoming technologies and the ways they can be used to improve employee performance and satisfaction. The conclusion is that AI plays a pivotal role in driving organizational change by pragmatically integrating human and digital workforces.

Therefore, HR needs to be more than ready to embrace these technological advancements to raise the bar of accuracy and expertise.


Telematics for Usage Based Insurance

Posted By: Usha Rama | Sep-01, 2017 | Informative, Engineering, Tech, Technology Upgrades

Insurance telematics and usage based insurance (UBI) programs continue to roll out across the globe. The rapidly changing landscape is forcing insurers to select the right business models to ensure success. Many programs have launched with a focus on price discounts, resulting in a foreseeable race to the bottom, but the industry is moving in a direction where a subset of companies will succeed.

Usage Based Insurance (UBI), a recent innovation by auto insurers, aligns premium rates more closely with driving behavior. Mileage and driving behaviors are tracked using odometer readings or in-vehicle telecommunication (telematics) devices, which are typically self-installed into a vehicle port or come as equipment already integrated by car manufacturers. The essential idea of telematics auto insurance is to monitor driving behavior directly while the person drives. With these telematics devices, insurers can measure a number of elements of interest to underwriters: miles driven; time of day; rapid acceleration; hard braking; hard cornering; where the vehicle is driven (GPS); and airbag deployment. The level of data collected generally reflects the telematics technology employed and the policyholder's readiness to share personal data. The insurance company then assesses the data and charges premiums accordingly. For example, a driver who drives long distances at high speed will be charged a higher rate than a driver who drives short distances at minimal speeds. UBI premiums are collected using a variety of methods, including at the gas pump, direct billing, debit accounts, and smart card systems.

About a decade ago, the first UBI programs began to appear in the U.S., when Progressive Insurance Company and General Motors Acceptance Corporation (GMAC) began to offer mileage-linked discounts through combined GPS technology and cellular systems that tracked miles driven. These offers were combined with ancillary benefits such as roadside assistance and vehicle theft recovery. Recent advances in technology have improved the efficiency and reduced the cost of using telematics, enabling insurers to capture not just how many miles people drive, but when and how they drive too. Such strategies have resulted in the growth of several UBI variations, including Pay-As-You-Drive (PAYD), Pay-As-You-Go, Pay-How-You-Drive (PHYD), and Distance-Based Insurance.

Usage Based Insurance Pricing

The pricing schemes for UBI differ greatly from those of traditional auto insurance, which relies on actuarial studies of aggregated historical data to create rating factors such as driving record, personal characteristics (age, gender, and marital status), credit-based insurance score, vehicle type, garage location, vehicle use, previous claims, liability limits, and deductibles. Premium discounts on traditional auto insurance are usually limited to bundling insurance on several vehicles or types of insurance, insuring with the same carrier, protection devices, driving courses, home-to-work mileage, and the like.

Policyholders tend to think of traditional auto insurance as a fixed cost, assessed annually and usually paid in lump sums on an annual, semi-annual, or quarterly basis. But studies show that there is a strong correlation between claim costs and mileage driven, particularly within existing pricing factors (such as class and territory). This may be one reason many UBI programs seek to convert fixed costs into mileage-driven costs that can be used in combination with other rating factors in the premium calculation.

Usage-based insurance has the advantage of utilizing individual, current driving behavior rather than relying on aggregated information and driving records of past trends and events, making premium pricing more individualized and precise.
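As a rough illustration (every factor and weight here is hypothetical, not any insurer's actual model), a Pay-How-You-Drive premium adjustment might combine mileage and behavior signals like this:

```python
def ubi_premium(base_annual_premium, miles_driven,
                hard_brakes_per_100mi, night_mile_fraction):
    # Scale with mileage relative to an assumed 12,000-mile norm, capped.
    mileage_factor = min(miles_driven / 12000, 1.5)
    # Surcharge for harsh braking and night driving (illustrative weights).
    behavior_factor = 1.0 \
        + 0.02 * hard_brakes_per_100mi \
        + 0.10 * night_mile_fraction
    return base_annual_premium * mileage_factor * behavior_factor

# A low-mileage, smooth driver vs. a high-mileage, harsher one:
print(round(ubi_premium(600, 6000, 1, 0.05), 2))    # 307.5
print(round(ubi_premium(600, 18000, 6, 0.30), 2))   # 1035.0
```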

Advantages

Usage-based insurance programs offer many advantages to insurers, consumers, and society. The main aim is to link insurance premiums more closely to the actual performance of an individual vehicle or fleet. This increases affordability for lower-risk drivers, many of whom are also lower-income drivers. It also gives consumers the ability to control their premium costs by incentivizing them to reduce miles driven and adopt safer driving habits. Fewer miles and safer driving also help reduce accidents, congestion, and vehicle emissions, which benefits society.

The usage of telematics helps insurers more accurately estimate accident damages and reduce fraud by enabling them to evaluate driving data from the time of an accident. In addition, the ancillary safety benefits offered in conjunction with many telematics-based UBI programs help lower accident and vehicle-theft costs by improving accident response times and by allowing stolen vehicles to be tracked and recovered.

Challenges

Tracking mileage and behavior information in usage-based insurance programs has raised privacy concerns; as a result, some states have enacted legislation requiring disclosure of tracking practices and devices, and some insurers limit the data they collect. This is not for everyone, but acceptance of sharing such information is growing as tracking becomes mainstream through devices such as smartphones, tablets, and GPS units, and as social media networks like Facebook and Twitter enter the market.

Implementing a UBI program, particularly one that utilizes telematics, can be expensive and resource-intensive for the insurer. In addition, UBI is an emerging area, so there is still much uncertainty in selecting and understanding driving data and in how it should be integrated into existing or new price structures to maintain profitability. This matters because transitioning lower-risk drivers into usage-based insurance programs that offer lower premiums could put pressure on overall insurer profitability.


Biomedical Signal Processing

Posted By: Usha Rama | Aug-30, 2017 | Informative, Engineering, Tech, Technology Upgrades
http://image.usharama.edu.in/blog/biomedical-signal-processing/bms-usha-rama-blog.PNG


Our bodies constantly communicate information about our health. This information can be captured through physiological instruments that measure heart rate, oxygen saturation levels, blood pressure, blood glucose, brain activity, nerve conduction, and so on. Traditionally, such measurements are taken at specific points in time and noted on the patient's chart. However, physicians actually see less than one percent of these values as they make their rounds, and treatment decisions are based upon these isolated readings.

Introduction

Biomedical signal processing involves analyzing these measurements to provide accurate information upon which clinicians can base decisions about their patients. Engineers are discovering new ways to process these signals using a variety of mathematical formulae and algorithms. Working with traditional bio-measurement tools, the signals can be computed by software to provide physicians with real-time data and greater insight to aid in clinical assessments. By using sophisticated means to analyze what our bodies are saying, we can potentially determine the state of a patient's health through more non-invasive measures.

Real-Time Monitoring

Real-time monitoring is another powerful way to track a patient's signals and act on them. It can lead to better detection and management of chronic diseases, earlier detection of adverse events like heart attacks and strokes, and timely diagnosis of disease. Biomedical signal processing is especially useful in critical care settings, where patient data must be analyzed in real time.

Researchers at the University of Ontario Institute of Technology, working in conjunction with technology giants like IBM, have created an environment for sophisticated data analysis that captures every reading from every medical device in order to support clinical decision-making. A fully functioning pilot program targeting the neonatal intensive care unit (NICU) of The Hospital for Sick Children, Toronto, has been in place since 2009. There, doctors are researching the use of advanced stream-computing software that may support early detection and serve as an early-warning system to alert NICU staff to emergency or life-threatening complications. Presently, the streaming environment processes 1,256 data points per patient each second, providing constant monitoring of any changes in an infant's condition.
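As a toy illustration of the idea (not the actual stream-computing system described above), a sliding-window monitor can flag readings that drift from a recent baseline, which is the basic shape of an early-warning rule:

```python
from collections import deque

class VitalMonitor:
    def __init__(self, window=30, threshold=3.0):
        self.window = deque(maxlen=window)   # most recent readings
        self.threshold = threshold           # allowed deviation from baseline

    def update(self, value):
        self.window.append(value)
        mean = sum(self.window) / len(self.window)
        full = len(self.window) == self.window.maxlen
        if full and abs(value - mean) > self.threshold:
            return f"ALERT: reading {value} deviates from baseline {mean:.1f}"
        return None

monitor = VitalMonitor()
for hr in [142, 141, 143, 142] * 10 + [129]:   # simulated heart-rate stream
    alert = monitor.update(hr)
    if alert:
        print(alert)
```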

Cloud Computing

Taking real-time monitoring one step further, researchers are testing cloud computing as a way to provide advanced specialist support to rural and remote communities. The cloud computing approach is currently being tested using data from the neonatal intensive care unit of Women and Infants Hospital in Providence, Rhode Island.

Providing a remote database also has implications for telemedicine applications. Real-time embedded signal processing could be built into chips that are part of small, lightweight devices integrated into cell phones or worn by patients (see Wearable & Implantable Technologies), who could then be observed from home.

A Closed System

We know that the human body comprises several systems working together in a closed loop, programmed to preserve life. A set of points in our brain continually monitors and responds to internal and external influences to regulate body temperature. Our heart rate varies in response to the autonomic nervous system, which acts as a feedback system, directing the heart to make adjustments according to the body's level of exertion. Likewise, as a person begins to become ill, the body reacts very subtly. Everything affects something else, and these effects can be measured and interpreted. By doing complex analyses of the body's signals, we can discover early indications of how various conditions manifest themselves.

Toward an Understanding of Alzheimer’s

Biomedical signal processing can lead to better and timelier diagnosis and treatment of a disease such as Alzheimer's. Researchers are combining EEG readings with other testing parameters to detect patterns that differentiate Alzheimer's patients from those with other forms of dementia.

They are focusing on the deteriorating synchronization between the left and right sides of the brain. At present, a definitive diagnosis can only truly be made after an Alzheimer's patient has died, and in most cases the disease is suspected only once it is already at an advanced stage. Earlier detection could allow for earlier intervention with drugs that slow the progression of the disease.

Abolishing the Guesswork

Our body is our greatest asset, but the current limitations of science and medicine force a degree of guesswork on the part of physicians. Why is this? Treatments are often employed in a trial-and-error fashion based upon each physician's experience with their own patients. But is this the right way of treating patients? Sometimes doctors do not know whether a patient has gotten better as a result of the body's own capability to heal itself or through medical intervention. A more advanced way of detecting the actual issue before treating it is always preferable, because guesswork may sometimes help us, but not always.

An expert physician can consider four to seven variables simultaneously, yet some 20 or 30 different situations could be occurring in a patient's body at once. So the physician makes an educated guess based on previous experience. Giving the physician better information will lead to better decisions. The more we understand the system, the more we can remove the guesswork.

For example, take the lungs: lung tissue is just 5 microns thick. If a patient is on a ventilator, a doctor palpating the patient's chest through the rib cage cannot reliably determine the stiffness of the patient's lungs. So the amount of ventilation prescribed is estimated and subsequently adjusted based on how that patient responds. Here, engineering can help eliminate, or at least reduce, the trial and error that occurs for millions of real-life patients every day.

Knowing the Standards and Protocols

To acquire and understand data from multiple sensors and create meaningful information, specific protocols and standards need to be established and followed. What are the protocols for data acquisition? How will the data be transmitted and stored? How can it be processed to provide real-time information about the patient's health? How can it be mined to discover new information about a particular patient, as well as about the population at large? This, again, is where the hand of the engineer comes into play. But oftentimes our hands are also tied.

Many biomedical sensors operate using proprietary protocols that are guarded to protect financial interests. In order to advance our understanding of bodily and disease processes, this will have to change.

One effort in this direction is the BioSig project, an open-source software library that provides tools for the analysis of many different biosignals; these tools address issues such as data acquisition, quality control, artifact processing, and data visualization.

Multi-Scale Signal Processing

By taking and examining measurements in vast amounts, engineers are working toward an improved understanding of how physiological systems work. A lot of effort is currently focused on multi-scale signal processing: looking for features in measurements taken at varying scales in order to make more reliable predictions about the whole patient and his or her records.
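As a minimal sketch of the idea, the same signal can be summarized at several window sizes, so fast events and slow trends each become visible at their own scale; the signal here is synthetic:

```python
import numpy as np

def moving_average(signal, window):
    """Smooth a 1-D signal with a flat kernel of the given width."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="valid")

t = np.linspace(0, 10, 1000)
# Slow trend plus a faster oscillation, standing in for a physiological signal.
signal = np.sin(2 * np.pi * 0.2 * t) + 0.3 * np.sin(2 * np.pi * 5 * t)

for window in (5, 50, 200):                 # fine, medium, coarse scales
    smoothed = moving_average(signal, window)
    print(f"window={window:3d}  variance={smoothed.var():.4f}")
```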

Biomedical signal processing spans the entire spectrum of health and wellness and is one basis of how engineering aids the field of medicine. Medicine is an empirical field in which the doctor and the treatment work together. Doctors understand medicine based on what they know to be true through their study and practice; engineers, on the other hand, focus on trying to fully understand a particular system. Once they truly know the answer, the work in that area will be done.
