
Wednesday, May 13, 2015

US government grants $3 million to fight future cyberattacks

Algorithmic attacks, an emerging hacking threat, can do serious damage to computer systems. They are considered more complex, harder to detect and more effective at damaging nations' computer systems than conventional attacks.

They are also extremely hard to detect with existing security technology, according to Dyman & Associates Risk Management Projects.

For now, these attacks can be mounted only by hackers hired by nation states, which have the resources required, but that may not remain the case for long.

Computer scientists at the University of Utah and the University of California, Irvine have been awarded $3 million by the U.S. Department of Defense to produce software that will detect and fight future cyberattacks.

The University of Utah team will comprise 10 faculty members, postdoctoral researchers and graduate students. Of the $3 million grant, which runs over four years, $2 million will go to the Utah team and $1 million to the Irvine team.

The project is funded by the Defense Advanced Research Projects Agency (DARPA) in a new program called STAC, or Space/Time Analysis for Cybersecurity.

The team is tasked with creating an analyzer that can fight so-called algorithmic attacks, which target the set of rules or calculations a computer must follow to solve a problem.

The analyzer will need to perform mathematical simulations to predict what would happen in the event of an attack, and it must examine computer programs to detect algorithmic vulnerabilities, or “hot spots,” in the code. It is, in effect, a spellchecker for cybersecurity.
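Neither the article nor DARPA describes how the analyzer will work internally, but a toy Python sketch (entirely hypothetical, not STAC's design) can suggest the flavor of the problem: one crude way to flag a time "hot spot" is to run a routine on inputs of growing size and estimate its empirical growth exponent, since an exponent near 2 or higher on attacker-controlled input is a warning sign.

```python
import math
import time

def growth_exponent(func, sizes):
    """Time func on inputs of each size and estimate the growth exponent
    as the slope of log(time) vs. log(size) between the extremes."""
    times = []
    for n in sizes:
        data = list(range(n))
        start = time.perf_counter()
        func(data)
        times.append(time.perf_counter() - start)
    return (math.log(times[-1]) - math.log(times[0])) / (
        math.log(sizes[-1]) - math.log(sizes[0])
    )

def pairwise_scan(items):
    # Deliberately quadratic: compares every pair of items.
    return sum(1 for a in items for b in items if a == b)

print(growth_exponent(pairwise_scan, [250, 500, 1000, 2000]))  # roughly 2.0
```

The analyzer the team is building is meant to find such hot spots by examining the program itself, rather than by timing it from the outside as this toy does.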

Matt Might, an associate professor of computer science at the University of Utah and a co-leader of the team, said the military is looking ahead at what is coming in cybersecurity, and it looks like algorithmic attacks. He likened the current state of computer security to a house whose ground-floor doors are unlocked: there is no point in getting a ladder and scaling the wall to an unlocked window on the roof.

"But once all the doors get locked on the ground level, attackers are going to start buying ladders. That's what this next generation of vulnerabilities is all about."

Conventional attacks exploit mistakes programmers make when writing software. For instance, a program may take input crafted by a hacker and use it without validating it first, a vulnerability that can give the hacker access to the computer or cause it to leak information.
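As a concrete illustration of such a conventional flaw, here is a minimal, hypothetical Python sketch of a file-serving routine (the directory and function names are invented for the example) that uses its input without validating it first:

```python
import os

DATA_DIR = "/srv/app/data"  # hypothetical data directory for the example

def read_report_unsafe(name):
    # BUG: the input is used without validation, so a crafted name such as
    # "../../../etc/passwd" escapes DATA_DIR and leaks arbitrary files.
    with open(os.path.join(DATA_DIR, name)) as f:
        return f.read()

def read_report_safe(name):
    # Fix: canonicalize the path and confirm it stays inside DATA_DIR.
    path = os.path.realpath(os.path.join(DATA_DIR, name))
    if not path.startswith(os.path.realpath(DATA_DIR) + os.sep):
        raise ValueError("invalid report name")
    with open(path) as f:
        return f.read()
```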

Algorithmic attacks are very different because they do not need to find such conventional vulnerabilities. For instance, they can secretly track how much energy a computer is using and use that information to infer the sensitive data the computer is processing, or they can secretly track how an algorithm is running inside the computer. They can also drive the central processing unit (CPU) to overwork, or disable a computer by forcing it to use too much memory.
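None of these attack types is spelled out in code in the article; as one hedged example, the following Python sketch (a hypothetical service, standard-library zlib only) shows the memory-exhaustion variety, a "decompression bomb" that makes a service allocate far more memory than the request size suggests, along with one common mitigation:

```python
import zlib

# Crafted payload: ~10 MB of zeros compresses to only a few kilobytes.
bomb = zlib.compress(b"\x00" * 10_000_000, 9)
print(f"bytes on the wire: {len(bomb)}")

def handle_upload_unsafe(blob):
    # BUG: unbounded allocation; a tiny request balloons to 10 MB or far more.
    return zlib.decompress(blob)

def handle_upload_safe(blob, limit=1_000_000):
    # Fix: cap the decompressed size and reject anything that exceeds it.
    d = zlib.decompressobj()
    out = d.decompress(blob, limit)
    if d.unconsumed_tail:  # more output was pending beyond the cap
        raise ValueError("decompressed size exceeds limit")
    return out

try:
    handle_upload_safe(bomb)
except ValueError as e:
    print("rejected:", e)
```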

Suresh Venkatasubramanian, another co-leader of the team, says these algorithmic attacks are especially devious because they exploit weaknesses in how an algorithm uses resources such as space and time.


Algorithmic attacks are complex, costly and time-consuming to mount, so most hackers today do not use them; they take the easier route of exploiting existing conventional vulnerabilities.

Sunday, May 3, 2015

Dyman Associates Risk Management Review: Manufacturers Should Upgrade Practices


A new report from Deloitte and the Manufacturers Alliance for Productivity and Innovation recommends that manufacturers convert their risk management practices to "an ongoing conversation rather than a periodic presentation."

The study, titled "Understanding Risk Assessment Practices at Manufacturing Companies," said the evolution of technology within the manufacturing sector presents vulnerabilities as well as opportunities, and that new threats can strike with unprecedented speed.

The report argued companies should improve their use of technology in risk management, consider increasing the frequency of assessments and embed those practices within all levels of company operations.

"In short, risk assessment and management techniques should advance at a rate equal to or greater than the underlying business," the report said.

Companies surveyed by Deloitte and MAPI identified cyber security as the biggest IT risk three years from now, with product design and development innovation as the top business risk over that span. The report said companies should utilize cyber security controls, but that they should also increase their insight into potential threats and how to appropriately respond to them.

The study also noted that 93 percent of companies indicated oversight of their risk management rested with the full board or an audit committee, and suggested that "given the rising complexity facing most manufacturing organizations ... it may be time to give risk management a clear subcommittee."

A dedicated committee could become increasingly involved in day-to-day operations, the report noted. It called for a "proper executive champion" for that role, potentially including the creation of a chief risk officer.

Improved risk management and audit practices, meanwhile, could also help create a more resilient supply chain, as well as improve employee recruitment and retention amid ongoing concerns about a manufacturing skills gap.

Although improving risk management practices wouldn't dramatically alter a company’s bottom line, the report said the potential benefit to competitive advantages and shareholder confidence "will naturally make its way into earnings."

"Organizations should establish a risk assessment program that fits into its unique culture and risks," said MAPI deputy general counsel Les Miller. "Since change is constant and can occur suddenly, ongoing efforts to enhance the sophistication and variety of risk assessment techniques are needed."

The study was based on an online poll of 68 members of MAPI's Internal Audit and Risk Management Councils conducted in June 2014. Respondents' annual revenues ranged from less than $1 billion to more than $25 billion, with the majority between $1 billion and $10 billion.


Wednesday, April 29, 2015

Dyman Associates Risk Management Review: The Unfolding Role of Risk Managers -- New Demands, New Talent

Melissa Sexton, CFA, is the head of Product and Investment Risk for Morgan Stanley Wealth Management. Prior to this, she spent nearly a decade serving as Chief Risk Officer at two different hedge funds in New York. Most of Melissa's 25 years of experience has been in a variety of risk management roles, though she has also traded derivatives and worked in operations, and she has continuously worked on projects that integrate risk management with information technology. Ms. Sexton is a member of PRMIA New York's steering committee, received a BA in Mathematics and Economics from Boston University, and was awarded her CFA charter in 2001.

Christopher Skroupa: You started your career in risk management in the 1990s, a decade notable for rapid changes in information technology combined with extraordinary growth and development of financial products. How have these changes affected the risk management function over your career?

Melissa Sexton: The changes have been significant and continue to be. When I started in the field, the most sophisticated financial instrument was an exchange-traded option, a standardized product with fully transparent pricing and contract terms. Software for standardized products can be commoditized and developed fairly quickly, but products with multiple triggers and non-standard underlyings meant that technology and risk models needed to be flexible and much more complex. Risk managers needed not only to be knowledgeable about valuation models and the nuances of different financial markets, but also to have more of an enterprise view of risk. The risk function in the early nineties was largely focused on managing market and credit risks, but the massive growth of over-the-counter (OTC) derivatives, also known as off-exchange trading, led to increased counterparty, operational and liquidity risks. It also led to a need for enhanced Know Your Customer (KYC) controls, which support a business in verifying the identity of its clients, to manage reputational risk.

Skroupa: Can you compare and contrast your previous role of chief risk officer at a hedge fund with your current role managing investment and product risk at a large, complex organization like Morgan Stanley Wealth Management?

Sexton: In many ways, the roles are quite similar because most risk management positions require a blend of quantitative and financial expertise, technology and communication skills. It will always be essential that risk managers are able to influence behavior. But the biggest difference I experienced while working at hedge funds was the emphasis on stress testing and liquidity risk management, both fund liquidity and asset liquidity. This is because of the higher leverage employed in most hedge fund strategies and the prevalent use during the financial crisis of gate provisions, which limited the amounts clients could withdraw from funds. I worked closely with clients during this hectic period, which gave me insights into their unique needs and circumstances.

At Morgan Stanley Wealth Management (MSWM), we are also focused on individual client needs and circumstances, but the size and scale of this business differs materially. With more than 16,000 financial advisors and approximately $2 trillion in client assets, we need to focus not only on clients and their accounts, but also on financial advisors, financial markets and the multitude of investment products and solutions we offer.


Monday, April 6, 2015

Dyman & Associates Risk Management Projects: New Chip can Turn Smartphone into 3D Scanner

With 3D printers now widely known, all that remains is an accurate, portable 3D scanner to let us produce practically anything on the go. Current 3D scanners are bulky and very expensive, but we may soon have that functionality built into our smartphones.

A team of Caltech researchers led by Ali Hajimiri has designed a small camera chip that can enable a smartphone to make an accurate 3D scan of an object.

The tiny silicon chip, called a nanophotonic coherent imager (NCI), measures just one square millimeter and can easily fit inside a smartphone. It uses a type of Light Detection and Ranging (LIDAR) technology to capture an item's width, depth and height: a laser is shone on the object, and the light waves that bounce off it guide the imager in capturing the measurement data.

The technology used on the chip is further explained by Caltech:

"Such high-res images and data provided by the NCI are made possible because of an optical concept known as 'coherence'. If two light waves are coherent, the waves have the same frequency, and the peaks and troughs of light waves are exactly aligned with one another. In the NCI, the object is illuminated with this coherent light. The light that is reflected off of the object is then picked up by on-chip detectors, called grating couplers, that serve as 'pixels', as the light detected from each coupler represents one pixel on the 3-D image."

According to Dyman & Associates Risk Management Projects, LIDAR technology is commonly used in self-driving cars, robots and precision missile systems due to its effectiveness in identifying locations and objects. Although the concept of LIDAR is not that new, their idea of having "an array of tiny LIDARs on our coherent imager can simultaneously image different parts of an object without the need for any mechanical movement" is a novel one.

Essentially, every pixel on the sensor separately assesses the intensity, frequency and phase of the reflected waves, producing one piece of 3D information. Combining those pieces of 3D data from all the pixels results in the full 3D scan.
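Caltech has not published the chip's processing pipeline in this article, but a toy sketch under a deliberately simplified coherent-detection model (the wavelength is an assumed telecom-band value and the phase readings are made up) shows how a per-pixel phase measurement becomes a per-pixel depth:

```python
import numpy as np

WAVELENGTH = 1.55e-6  # assumed laser wavelength in meters (common telecom value)

def phase_to_depth(phase):
    """Round-trip coherent model: phase = 4*pi*depth / wavelength, so
    depth = phase * wavelength / (4*pi), ambiguous modulo half a wavelength.
    Real imagers resolve that ambiguity, e.g. by sweeping the laser frequency."""
    return phase * WAVELENGTH / (4 * np.pi)

# A hypothetical 4x4 sensor: each entry is the phase measured at one pixel.
rng = np.random.default_rng(0)
phases = rng.uniform(0.0, 2 * np.pi, size=(4, 4))
depth_map = phase_to_depth(phases)  # one depth sample per pixel
print(depth_map)                    # stitching all pixels together gives the scan
```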

Caltech's concept allows for the development of a tiny and relatively cheap scanner without sacrificing accuracy. Dyman & Associates Risk Management Projects reported that the new chip can create scans that match the original to within microns.

At present, the prototype Caltech has made has only 16 pixels, just enough to scan small objects such as coins, but the team is reportedly working on scaling it up to thousands of pixels.

Tuesday, February 17, 2015

Dyman & Associates Risk Management Projects: 10M Passwords Publicized For Research

We've all heard this before: a hacker releasing a trove of passwords and usernames, presumably just for the lulz. But this time, we're talking about 10 million records posted by no less than a security specialist himself.

Security expert Mark Burnett has published 10 million sets of usernames and passwords online in an effort to equip the security sector with more information, while also getting himself potentially tagged as a criminal.

He clarified that his release of the username-password list is solely for white-hat purposes -- to aid research in making login authentications more effective and fraud-proof. Burnett insisted that he does not intend to help facilitate any illegal activity or defraud people by his actions.

"I could have released this data anonymously like everyone else does but why should I have to? I clearly have no criminal intent here. It is beyond all reason that any researcher, student, or journalist have to be afraid of law enforcement agencies that are supposed to be protecting us instead of trying to find ways to use the laws against us," he said in his post.

Leaking a massive amount of user data into the wild certainly does not sound like a great help to most people, but for security professionals it is an important research tool. For instance, how else would they know that online users are generally bad at choosing passwords?
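As a hedged illustration of that kind of research, a frequency analysis over such a dump takes only a few lines of Python; the file name and tab-separated "username<TAB>password" format below are assumptions, not Burnett's actual layout:

```python
from collections import Counter

def top_passwords(path, n=10):
    """Tally the most common passwords in a 'username<TAB>password' file."""
    counts = Counter()
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            parts = line.rstrip("\n").split("\t", 1)
            if len(parts) == 2:
                counts[parts[1]] += 1
    return counts.most_common(n)

# e.g. top_passwords("combos.txt") on a real dump reliably surfaces entries
# like "123456" and "password" at the top of the list.
```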

In his post, he shared that he often used to get requests for his password data from researchers, which he would decline. But since he also knew the data's importance, he decided to publish a clean data set for the public.

"A carefully-selected set of data provides great insight into user behavior and is valuable for furthering password security. So I built a data set of ten million usernames and passwords that I am releasing to the public domain."

To be fair, Dyman & Associates Risk Management Projects confirms that analyzing usernames paired with passwords is more helpful to security researchers than studying passwords alone.

According to him, it was by no means an easy decision, but he eventually posted the data after weighing a number of factors. And though Burnett said he believes most of the data is already expired and unused, the domain portion of the logins and any keyword that could link them to a particular site were removed to make the list difficult for those with criminal intent to abuse.
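Burnett did not publish his cleaning script, and the exact rules he applied are unspecified; a minimal sketch of just the domain-stripping step he describes might look like this:

```python
def scrub_username(username):
    """Drop the domain portion of an email-style login:
    'alice@example.com' -> 'alice'."""
    return username.split("@", 1)[0]

print(scrub_username("alice@example.com"))  # alice
```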

Moreover, Dyman & Associates Risk Management Projects experts agreed with his point that any hacker who needs such a list in order to attack someone is not going to be much of a threat.

Burnett previously helped compile the recent list of worst passwords, intended to alarm people into adopting better practices for their login credentials.

Lastly, he imparted the following warning for complacent users: "Be aware that if your password is not on this list that means nothing. This is a random sampling of thousands of dumps consisting of upwards to a billion passwords."