A survey of more than 1,200 risk managers and corporate insurance experts in over 50 countries identified business interruption as the top concern for 2017. According to the sixth annual Allianz Risk Barometer of top business risks, this is the fifth successive year that business interruption has been seen as the biggest risk.
“Companies worldwide are bracing for a year of uncertainty,” Chris Fischer Hirs, CEO of AGCS said in a statement. “They are concerned about rather unpredictable changes in the legal, geopolitical and market environment around the world. A range of new risks are emerging beyond the perennial perils of fire and natural catastrophes and require re-thinking of current monitoring and risk management tools.”
While natural disasters and fires are what businesses fear most, non-damage events such as a cyber incident, terrorism or political violence resulting in denial of access are moving higher up on the scale, according to the report. These types of incidents can cause large loss of income to companies, without actual physical loss.
The second concern, market developments, could result from stagnant markets or M&As, or from digitalization and use of new technologies.
Cyber risk, third on the list of perils, has jumped from 15th place in just four years. It was identified as the second-biggest concern in both the United States and Europe.
Technology is changing the way we interact with the world, whether it is newer and smarter vehicles, a greater ease of communication, or an increase in the data available to help companies better target their marketing towards consumers. The rewards of these technological leaps, however, also come with some rather large risks, or at the very least create issues that have yet to be addressed.
The auto insurance industry is as vulnerable to these risks as any other. Technology has the potential to disrupt the insurance industry in a short period of time, and companies must determine how best to adapt to the ever-changing technological landscape and use it to their advantage. Here are three major trends in the auto insurance industry likely to be of concern in the coming months.
Automated vehicle technology
Automated driving is a broad concept, encompassing a myriad of technologies, some currently in production and others still in the imaginations of the world’s top creative engineers. The concept of autonomous cars is constantly evolving and has not reached full automation. However, further development of automated functionalities is inevitable.
While there are many benefits to automation, the technology is advancing faster than the legal and regulatory environments that govern it. Are insurers prepared for the impact of automated driving technology on the auto insurance industry?
One of the major benefits of automated vehicle technology is its ability to reduce the frequency and severity of auto accidents. This benefit will impact both the need for liability insurance and the number of claims submitted under insurance policies. As these technologies become mainstream, the personal lines auto insurance industry is projected to shrink by as much as 60 percent. A wholesale reevaluation of what is covered under auto insurance policies will be needed long-term. More immediately, insurers will be forced to address how automated technologies impact discounts, underwriting, apportionment of liability and claims handling procedures.
Another concern is automated vehicle technology's vulnerability to hacking. New technology creates unique cyber vulnerabilities that are unprecedented in the auto industry. The data connections and sensors critical to automated driving open the door for hackers to conduct cyberattacks that can compromise the security of the data collected, as well as the safety and operation of the vehicle itself.
Attacks that threaten the integrity of a vehicle’s safety operations pose a significant risk to personal safety, but interception of the data collected by the vehicles presents an equally significant economic risk. Who is responsible for accidents caused when a vehicle is hacked? Is it the driver, the manufacturer of the vehicle, or the hacker alone? Will insurance coverage be modified to cover cyberattacks? If so, under what circumstances? These questions are largely unanswered at the moment, but as the technology emerges, insurance companies will be forced to address them.
Distracted driving
Technology is not just affecting the automotive industry, but virtually every facet of our modern economy, and no consumer technology is more ubiquitous in American culture than the smartphone. At any given moment, approximately 660,000 drivers are using cell phones or electronic devices while driving, drawing greater attention to distracted driving.
Plaintiffs in personal injury claims arising from auto accidents have sought to place blame for distracted driving on the deeper-pocket cell phone manufacturers and app developers. Although many of these cases have had little success, some are gaining traction. For example, a car accident victim sued Snapchat, alleging it was liable for an accident caused by a driver who was using the app’s “speed filter,” which allows a driver to include a speed along with the driver’s picture. The plaintiff alleged that the “speed filter” not only distracted the driver, but also encouraged her to speed. The outcome of the case remains unclear, but if the plaintiff is successful, the decision could herald a wave of lawsuits against cell phone manufacturers and mobile app developers for damage caused by distracted drivers.
Insurance companies should pay attention to how the law develops on this topic since it could provide insurers with a mechanism for seeking contribution or indemnification from larger entities like cell phone manufacturers or mobile app developers.
Insurers’ reliance on big data
Although not a new trend, 2016 showed a continuing reliance on “big data,” or large amounts of consumer information culled from various locations and compiled into a neat package. Insurance companies use big data to engage in predictive modeling, particularly in underwriting, pricing and claims handling.
Big data can help with price optimization by allowing insurers to drill down into more specific segments of the population, or even to market to particular individuals based on their consumer activity.
However, the downside for insurance companies is the potential for discrimination claims as demographics are used to drive pricing of certain insurance products and the claims arising under those policies. In particular, regulators are concerned that certain big data modeling factors may be correlated with prohibited rating factors. Insurance regulators will continue to focus on regulating the use of big data and price optimization going forward.
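The proxy concern above can be sketched in a few lines of Python: before a big-data factor enters a rating model, check how strongly it correlates with a prohibited rating factor. This is a toy illustration; the factor names, data and threshold are invented, and real actuarial fairness testing is far more involved.

```python
# Toy proxy-factor screen. All names, data and the 0.8 threshold are
# invented for illustration; this is not a regulatory standard.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def flag_proxy_factors(candidates, prohibited, threshold=0.8):
    """Return names of candidate rating factors that correlate strongly
    with a prohibited factor and may therefore act as a proxy for it."""
    return [name for name, values in candidates.items()
            if abs(pearson(values, prohibited)) > threshold]

# Hypothetical screening run: one factor closely tracks the prohibited attribute.
prohibited = [0, 0, 1, 1, 0, 1]
candidates = {
    "neighborhood_density": [1, 2, 9, 8, 1, 9],   # tracks the attribute
    "vehicle_age":          [7, 10, 5, 8, 2, 6],  # does not
}
print(flag_proxy_factors(candidates, prohibited))  # prints ['neighborhood_density']
```

In practice, a simple correlation cutoff is only a first-pass screen; regulators and actuaries also look at multivariate effects that a pairwise check cannot catch.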
Each of these risks will affect how insurers write auto coverage going forward. Considering their impacts today, though, will allow insurers to develop products that continue to protect policyholders well into the future.
Everyone pays a price for complexity, whether in the form of reduced profits in a business, diminished efficiency in the public sector, or excessive cost imposed on consumers, who must pay more for products that are produced through unnecessarily complex means. The cost of complexity can be difficult to identify, however, because it defies measurement both in terms of direct costs and outcomes. More often than not it simply becomes a source of friction that slows everything down and takes a silent toll. Missed opportunities, reduced consumer satisfaction, slower growth and foregone investment are among the most common results of unnecessary complexity.
Examples of such complexity abound, whether in the form of multiple layers of required approvals in government, excessive and overlapping layers of production in business, or even, on an individual level, numerous stops at security checkpoints at airports. We have all asked ourselves why things need to be so complicated, and how much money and resources must be wasted in the process of maintaining “normal” operating conditions. Justifying such inefficiencies with the status quo argument of “that’s the way it’s always been done” is capitulation that ignores just how much the world has changed.
Status quo and decision avoidance have become the principal outcomes of most committees; research has shown that an average of five people are involved in decisions that require agreement about change or procurement. The friction created by such unnecessary complexity is akin to that generated by corruption—both exact a tremendous price from individuals, businesses and societies. Yet with so many people guilty of being accessories to complexity and friction, we are collectively disincentivized to devote the energy and resources required to move toward more efficient systems.
Some forms of lending and insurance that are intended to catalyze trade and investment flows end up contributing to what often become the most complex and time-consuming transactions in financial markets. For example, project finance transactions typically add unnecessary “drag” to what should be a streamlined process. As a result, a significant percentage of global trade and investment skew to larger firms that can afford to throw an army of lawyers, analysts and advisors at a transaction, adding many months and millions of dollars to the process. The average project finance transaction can take more than a year to complete, involve dozens of underwriters, lenders, lawyers and advisors, and require endless phone calls, meetings and document revisions—all sources of friction that greatly contribute to inflated prices, high opportunity costs and unnecessary complexity.
The time, effort and cost required to accommodate all these parties putting their thumbprint on agreements and ensuring that every conceivable contingency is accounted for can be an insidious source of friction. While intending to “de-risk” a deal, it ends up costing consumers much more than they would otherwise need to pay because of sunk costs, while scaring away would-be trade or investment partners in the process. If a single representative from each organization were simply assigned to each task, and a hard deadline was attached to the process (along with a monetary penalty for failing to comply), the time required to reach financial closure could surely be reduced by months, saving millions of dollars.
Another area where friction and complexity exact a heavy toll on economic growth is B2B sales. On one end, legions of sales professionals armed with the latest sales tactics promote their products and services. Prodded by a pervasive penalty-driven management culture, they are most often incentivized to maximize short-term results, ignore longer-term risks and deemphasize the development of solutions that genuinely add value to buyers. On the demand side of the equation is the “valley of decision avoidance,” where those five decision-making purchase committee members are reluctant to drive change, despite implied economic and other advantages. Transactions do get completed in this environment, but the time, economic drag and complexity added to the business cycle are detrimental to economic growth and to the velocity of trade and investment, which has the pernicious effect of dampening hiring, building up inventory and eroding general economic confidence.
Companies that do not buy from each other—or trade and invest with the rest of the world—are as detrimental to global growth as firms that take months to enact a purchase decision in their home markets. Red tape, bureaucracy and decision avoidance are the advanced economies’ counterpart to the corruption, bribery and fraud that plague many developing and emerging markets. But they can be defeated with nimbler decision-making, efficient processes and the will to overcome risk-aversion.
Since we are disincentivized to implement the fundamental reforms that would change organizational motives and unclog these insidious obstacles to efficiency, change must be fostered from the bottom up. The vast size, geographic spread and multiple organizational layers of giant “pace-setter” firms are all too often the originating sources of complexity and friction. These same firms increasingly fall prey to what are, in essence, avoidable risks that result in strategic disruption, particularly as new business models emerge that exploit their inability to move with agility.
Similarly, overly burdensome regulatory standards that are meant to control risk contribute to this process, often producing arbitrage opportunities in how capital, taxation and data flow in the global economy. Moving away from complexity and friction must begin with an acknowledgement of the adverse effects of these forces and the relationship between them. Only then can we begin to embrace simplicity and reduce friction on the path to economic growth.
The National Association of Insurance Commissioners issued the following news release:
Members of the National Association of Insurance Commissioners (NAIC) voted to increase the number of NAIC funded consumer representatives from 20 to 22 for 2017.
“The NAIC Consumer Liaison Program is essential to our work to educate and protect consumers,” said NAIC President-Elect and Wisconsin Insurance Commissioner Ted Nickel. “Consumer voices help us shape model regulations and laws, ultimately benefitting all U.S. consumers. We appreciate the input these representatives have shared through the years and look forward to celebrating the program’s 25th anniversary next year.”
The Consumer Liaison Program (http://www.naic.org/consumer_participation.htm) was established in 1992 to encourage consumer representation and participation in NAIC meetings. The NAIC provides reimbursement for travel expenses for qualified funded consumer representatives to enable their participation.
The application process (http://www.naic.org/Releases/2016_docs/naic_seeks_2017_consumer_liaison_representatives.htm) for 2017 consumer representatives is currently underway, and representatives will be announced early next year.
About the NAIC
The National Association of Insurance Commissioners (NAIC) is the U.S. standard-setting and regulatory support organization created and governed by the chief insurance regulators from the 50 states, the District of Columbia and five U.S. territories. Through the NAIC, state insurance regulators establish standards and best practices, conduct peer review, and coordinate their regulatory oversight. NAIC staff supports these efforts and represents the collective views of state regulators domestically and internationally. NAIC members, together with the central resources of the NAIC, form the national system of state-based insurance regulation in the U.S. For more information, visit www.naic.org.
For many, buying a home is both a major life milestone and a significant financial burden.
Homeowners, who must contend with all manner of accidents, decay, natural disasters and wildlife, look to insurance providers to protect their investments.
Thus far, the 2016 Atlantic hurricane season is the most active and costly since Hurricane Sandy in 2012, responsible for more than $8.65 billion in damages. For insurance providers, understanding the dynamics of a storm and the geography of the projected impact zone is essential to fundamental business operations during the hurricane season.
For example, home insurance providers need to know which of their insured properties, commercial and residential, exist in the path of a hurricane to determine whether the provider has enough reserves to offset potential losses. In the case of Hurricane Matthew, over 5.5 million homes and $1.4 trillion worth of property lay in the cone of the storm, according to Pitney Bowes geolocation data.
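A back-of-the-envelope version of that exposure check can be expressed as a spatial filter: sum the insured value of every policy whose coordinates fall inside the forecast cone. The sketch below uses a crude polygon test and invented figures; production systems rely on geocoding services and proper GIS tooling.

```python
# Toy exposure aggregation inside a storm's forecast cone. The cone is
# approximated by a polygon of (lon, lat) vertices; all values are invented.

def point_in_polygon(lon, lat, poly):
    """Ray-casting test: is the point inside the polygon?"""
    inside = False
    j = len(poly) - 1
    for i, (xi, yi) in enumerate(poly):
        xj, yj = poly[j]
        if (yi > lat) != (yj > lat) and \
                lon < (xj - xi) * (lat - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

def exposure_in_cone(policies, cone):
    """Total insured value of policies whose location falls in the cone."""
    return sum(p["insured_value"] for p in policies
               if point_in_polygon(p["lon"], p["lat"], cone))

# Hypothetical book of business against a rectangular cone over south Florida.
cone = [(-81.0, 25.0), (-79.0, 25.0), (-79.0, 28.0), (-81.0, 28.0)]
policies = [
    {"lon": -80.2, "lat": 26.1, "insured_value": 350_000},  # inside
    {"lon": -80.1, "lat": 27.5, "insured_value": 500_000},  # inside
    {"lon": -84.0, "lat": 30.0, "insured_value": 275_000},  # outside
]
print(exposure_in_cone(policies, cone))  # prints 850000
```

Comparing that aggregate against available reserves is the kind of decision the text describes; a real cone is an irregular, probability-weighted shape rather than a rectangle.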
Location intelligence, or the enrichment and analysis of location data for enhanced business insight, allows insurance providers to access the necessary data to make these critical business decisions before, during and after a hurricane hits.
The calm before the storm
From a business standpoint, insurance companies examine prospective new customers closely for their proximity to a forthcoming tropical storm. Specifically, they want reasonable assurance that the premiums collected over a given timeframe will offset any potential damages.
Keeping accurate data on policyholders helps to validate claims after a storm and enable the insurer to deliver more efficient service. How does it work in practice?
Homeowners will typically reach out to one or more insurance companies to answer a series of questions on a new property. These companies will then utilize location intelligence technology and data to examine the property using high precision geocoding and through appending descriptive attributes to a specific property location.
Insurance providers evaluate a series of spatial queries combined with historical data to quantify the potential risk that the property exposes them to (e.g. the property is on the water, has a high risk of flooding or resides in an area prone to earthquakes). In a normal course of business, that data is used to develop a risk profile which directly informs insurance rates.
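As a minimal sketch of how one spatial attribute might feed such a risk profile, the toy rater below surcharges properties within a few kilometres of sample coastline points. The coastline coordinates, distance threshold and surcharge factor are all invented; real raters draw on licensed hazard layers and far richer models.

```python
import math

# Invented coastline sample points (lat, lon); a real pipeline would use
# geocoded parcel data and a licensed coastline/hazard dataset instead.
COASTLINE = [(25.76, -80.13), (26.12, -80.10), (26.71, -80.03)]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def coastal_risk_factor(lat, lon):
    """Toy rating rule: 1.5x surcharge within 5 km of the coast, else 1.0x."""
    nearest = min(haversine_km(lat, lon, clat, clon) for clat, clon in COASTLINE)
    return 1.5 if nearest < 5.0 else 1.0

# A waterfront property draws the surcharge; an inland one does not.
print(coastal_risk_factor(25.77, -80.13))  # prints 1.5
print(coastal_risk_factor(26.00, -81.50))  # prints 1.0
```

An actual risk profile combines many such spatial queries (flood zone, elevation, seismic history) rather than a single distance rule.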
While the fire is hot
During the hurricane, insurance providers turn to location intelligence to track the impending damage in real-time. Location intelligence allows providers to continuously update their damage assessment based on risk datasets that feed into a high-quality model of the storm’s path.
Information-based models better assess the risk of hurricane damage by neighborhood, so insurance companies can price products correctly to avoid unnecessary risk and yet still serve policyholders. The accuracy of the data feeding these models is of the utmost importance: a difference of several thousand dollars in over- or underpriced premiums could come down to the exact borders of a particular flood zone.
The model is primarily used to forecast the hurricane’s intensity, track, storm surge and projected rainfall; however, other factors contribute to the overall damage assessment. Insurance companies can then use these real-time models to make timely, actionable decisions to minimize their risk exposure, pre-position resources and reduce unnecessary expenditures.
As Hurricane Matthew rolled toward Florida earlier this week, the meteorologists at Verisk Analytics were tracking its every move.
After the storm
Ultimately, in the aftermath of a hurricane, location intelligence can be key when calculating the return on investment. For example, Florida Farm Bureau Insurance operates in a state statistically likely to sustain damage from 50 percent of all hurricanes that occur in the United States.
When projecting the potential costs incurred from a storm, Florida Farm Bureau uses geocoding to efficiently and accurately determine rates for its customers, as well as to decrease the time and effort involved in the underwriting process. The result is reduced operating expenses and greater profitability — specifically, a 900 percent return on investment in the first 10 months following the implementation of location intelligence technology.
Because Florida Farm Bureau was able to ensure the accuracy of the information in its databases, it retained more customers through the policy renewal process. The company has also seen fewer fines and criticisms from regulatory agencies, while eliminating many of the manual processes that agents previously used to validate policyholder information, according to a Pitney Bowes Florida Farm Bureau case study.
To the everyday homeowner, insurance rates during hurricane season are an afterthought to immediate safety concerns. To insurance companies, however, hurricanes represent an unpredictable business liability that can be managed through the use of this innovative technology. Human resources departments can even use location intelligence to locate their employees and send contextually relevant communications before or during a major storm event.
Clarence Hempfield is vice president of product management, location intelligence for Pitney Bowes Software. Connect with him on LinkedIn.
Source: Pitney Bowes Risk Data Suite
Governmental self-insured employers are defined in Section 440.38(6), F.S. A governmental employer that meets this definition and has submitted an application to self-insure for workers’ compensation shall be deemed self-insured under the terms of this chapter unless it elects to procure and maintain insurance through the private market.
Below is important information regarding the self-insurance regulatory process, assessment rates, annual maximum compensation rates, EDI filings (medical, indemnity and policy information), audit, and other carrier regulated activities.
Following the presidential election, workers compensation legal experts say they are in wait-and-see mode with regard to the recent uptick in U.S. Equal Employment Opportunity Commission complaints.
“Everything with the EEOC may be off because of the election results this week,” said Chicago-based Jeff Nowak, co-chair of the labor and employment practice group at Franczek Radelet P.C., in a webinar hosted Thursday by the Disability Management Employer Coalition.
The webinar focused specifically on how workers comp intersects with the Americans with Disabilities Act Amendments Act and the Family and Medical Leave Act, and provided tips on how to avoid penalties, fines and lawsuits.
Mr. Nowak, who said he is watching whether the EEOC will be the priority for Donald Trump’s administration that it has been for other administrations, said a number of pending complaints and rulings are highlighting the difficulty for employers who want to both comply with federal laws and get injured employees back to work.
Adopted in 2008, the ADAAA amended the ADA, which essentially bars employers from discriminating against people with disabilities in any aspect of employment-related activities. Under the FMLA, employers are required to give their employees up to 12 weeks of unpaid leave per year for specific reasons, including a serious health condition or to care for an immediate family member who has a serious health condition. According to experts, the two can intersect with workers comp and create EEOC complaints.
Panelists highlighted communication with employees as the best way to smooth the path and avoid costly litigation.
Dubbed the “interactive process” in workers comp, this communication requirement calls for each side of a claim to exchange information in good faith. It is key to implementing ADAAA and FMLA requirements and the state regulations that govern workers comp, experts say.
Communication is “the most critical element,” said Adrienne Paler, director of total health and productivity management, integrated disability and absence management for Sutter Health, a Sacramento, California-based health system. “We need to tailor it … what is the employee asking for? That’s a question to ask the employee.”
“The risk of litigation increases when an employer is ignorant about their obligations under ADAAA to engage in the interactive process … they are unwilling to think creatively about what they can do to that job, what changes they can make to help the employee perform the job,” said Mr. Nowak.