This morning, Athena Alliance published a new Working Paper: Frameworks for Measuring Innovation: Initial Approaches by Susan Rose, Stephanie Shipp, Bhavya Lal and Alexandra Stone of the Science and Technology Policy Institute. This paper utilizes and builds on the discussion of innovation published in an earlier report by the authors for the BEA – Measuring Innovation and Intangibles: A Business Perspective (also available on the Athena Alliance website).
We know that innovation is a key driver of economic growth. As such, governments and private firms seek to foster and manage innovative activity. However, our understanding of innovation, including our measurement ability, is still not adequate. As the Commerce Department’s Advisory Committee on Measuring Innovation in the 21st Century Economy noted last year, we need “a stronger framework for identifying and measuring innovation in the national economy.” In this paper, the authors seek to answer that challenge.
The report begins with a working definition of innovation that includes 10 attributes:
- Attribute 1: Innovation involves the combination of inputs in the creation of outputs.
- Attribute 2: Inputs to innovation can be tangible and intangible.
- Attribute 3: Knowledge is a key input to innovation.
- Attribute 4: The inputs to innovation are assets.
- Attribute 5: Innovation involves activity for the purpose of creating economic value.
- Attribute 6: The process of innovation is complex.
- Attribute 7: Innovation involves risk.
- Attribute 8: The outputs in innovation are unpredictable.
- Attribute 9: Knowledge is a key output of innovation.
- Attribute 10: Innovation involves research, development, and commercialization.
One of the important points under attribute 4 is how intangible and tangible assets relate to innovation. The authors offer the following diagram (as elaborated on further in the companion piece, Measuring Innovation and Intangibles: A Business Perspective).
Intangibles are not innovation — but are inputs into the innovation process and are outputs from innovation. That is an important point to keep in mind.
The report then proposes two frameworks for measuring innovation:
- Framework 1 – Measures innovation activity by measuring the intangible assets that are created by and fed back into the innovation process at the firm or organizational level, which can then be scaled to the national level.
- Framework 2 – Measures innovation investments, especially the broader investments that set the stage for innovation.
The report goes on to provide an illustrative set of data sources for both frameworks, which demonstrates that appropriate data can be collected. Some of that data is already collected by the government, such as the National Science Foundation’s data on R&D spending. Others are collected by private sources, such as Computer Economics data on information technology (IT) spending, staffing, and technology trends. Other data, such as the organizational capital embodied in design and prototyping, can be proxied from other data, in this case, by the revenues of engineering and design firms as collected by Census.
As the authors point out, the choice of framework used depends on the goal of the exercise. If the goal is to understand which parts of innovation (for example, R&D or alliances) contribute to growth and to understand the process, the first framework is more useful. Innovation researchers would prefer this framework because it would provide more detailed insight into the innovation black box.
The second framework is the one most able to capture the basic investments contributing to productivity and growth. This approach is much more fundamental and flexible in that it encompasses all innovative activities, even those that are not now known.
I am especially happy that Athena was able to publish this report. The frameworks it presents provide an important guide for future research, especially in the development of future data sources.
As all (sports) eyes converge on Detroit this weekend for the NCAA Final Four, all (economic) eyes will also be converging on the city’s “brand” industry: autos. Today, the White House will announce details of its auto industry survival package. But leaks and actions have already outlined much of the plan, including the firing of GM head Rick Wagoner (see stories in the Washington Post, New York Times, the Wall Street Journal and Politico). The Administration has rejected the companies’ submitted plans and essentially said “try again.” According to the White House fact sheet:
- Viability of Existing Plans: The plans submitted by GM and Chrysler on February 17, 2009 did not establish a credible path to viability. In their current form, they are not sufficient to justify a substantial new investment of taxpayer resources. Each will have a set period of time and an adequate amount of working capital to establish a new strategy for long-term economic viability.
- General Motors: While GM’s current plan is not viable, the Administration is confident that with a more fundamental restructuring, GM will emerge from this process as a stronger more competitive business. This process will include leadership changes at GM and an increased effort by the U.S. Treasury and outside advisors to assist with the company’s restructuring effort. Rick Wagoner is stepping aside as Chairman and CEO. In this context, the Administration will provide GM with working capital for 60 days to develop a more aggressive restructuring plan and a credible strategy to implement such a plan. The Administration will stand behind GM’s restructuring effort.
- Chrysler: After extensive consultation with financial and industry experts, the Administration has reluctantly concluded that Chrysler is not viable as a stand-alone company. However, Chrysler has reached an understanding with Fiat that could be the basis of a path to viability. Fiat is prepared to transfer valuable technology to Chrysler and, after extensive consultation with the Administration, has committed to building new fuel efficient cars and engines in U.S. factories. At the same time, however, there are substantial hurdles to overcome before this deal can become a reality. Therefore, the Administration will provide Chrysler with working capital for 30 days to conclude a definitive agreement with Fiat and secure the support of necessary stakeholders. If successful, the government will consider investing up to the additional $6 billion requested by Chrysler to help this partnership succeed. If an agreement is not reached, the government will not invest any additional taxpayer funds in Chrysler.
- A Fresh Start to Implement Aggressive Restructurings: While Chrysler and GM are different companies with different paths forward, both have unsustainable liabilities and both need a fresh start. Their best chance at success may well require utilizing the bankruptcy code in a quick and surgical way. Unlike a liquidation, where a company is broken up and sold off, or a conventional bankruptcy, where a company can get mired in litigation for several years, a structured bankruptcy process – if needed here – would be a tool to make it easier for General Motors and Chrysler to clear away old liabilities so they can get on a path to success while they keep making cars and providing jobs in our economy.
- A Commitment to Consumer Warranties: The Administration will stand behind new cars purchased from GM or Chrysler during this period through an innovative warranty commitment program.
- Appointment of a Director of Auto Recovery: The Administration also announced that Edward Montgomery, a top labor economist and former Deputy Secretary of Labor, will serve as Director of Recovery for Auto Workers and Communities. Dr. Montgomery will work to leverage all resources of government to support the workers, communities and regions that rely on the American auto industry.
(See also other key documents: the Warranty Commitment Program, the GM Viability Assessment and the Chrysler Viability Assessment.)
The analysis touches on a couple of points I’ve made earlier. Concerning technology, the assessment views the technology transfer from Fiat to Chrysler as a positive step. However, even with the Fiat deal, “Given Chrysler’s limited financial resources, it can not make the necessary catch-up investments in R&D required to refresh its portfolio and bring it up to par with its competitors.” As damning, “Chrysler also lags its competitors in terms of manufacturing flexibility.”
The assessment also sees GM’s movement toward green technologies as positive. However, with respect to the Volt:
GM is at least one generation behind Toyota on advanced, “green” powertrain development. In an attempt to leapfrog Toyota, GM has devoted significant resources to the Chevy Volt. While the Volt holds promise, it is currently projected to be much more expensive than its gasoline-fueled peers and will likely need substantial reductions in manufacturing cost in order to become commercially viable.
They also see GM getting rid of Saab, Saturn and Hummer as a positive. As I’ve argued before, I hope that Saturn is spun-off and not shut down. It started as an innovative company and should be given a chance to fulfill that potential. (In case you missed it, Saturn has been running “we are still here” ads during the NCAA basketball games.)
The last part of the plan may be the most telling. It appoints a Director of Recovery for Auto Workers and Communities charged with dealing with the economic dislocation. As I’ve noted before, the auto industry task force has two missions: creating a new industry and mitigating the effects of the demise of the old industry. In my earlier comments, I thought those two missions were an either/or. Now I am beginning to believe they are really a both/and. Creating the new industry will require mitigating the negatives. It sounds like the auto task force has laid out a plan for the latter. It remains to be seen if they can pull off the former.
By the way, on the Final Four: Go MSU! As a diehard Wolverine, giving any credit to my cross-state rival is tough. But Big Ten, and especially Michigander, loyalty wins out.
This morning I heard distinguished speakers once again repeat a questionable statement: that green technology will save American manufacturing.
Why do people think that the US is going to walk into the world market and outcompete everyone in a technology that is widespread – and where, in some cases, we are already behind?
Of course to revive manufacturing we will need to make products that the rest of the world wants to buy. Of course hybrid/alternative fuel vehicles and green technologies are important for both economic and environmental reasons.
But the ultimate question is not just what we make. It is how we make things. If we don’t change how we make things, all the green technologies we develop here in the US will end up being made elsewhere. That is a topic that some on Capitol Hill and elsewhere are focusing on (see earlier posting).
Part of the answer to that question is production costs. And part of US production costs are health care costs. Solving the health care cost issue will help solve the manufacturing cost issue.
Another part of the answer is the production process (and issue of productivity). The production process has changed over the past few decades. It has become more knowledge and intangible intensive. It has become more collaborative. The old categories of manufacturing and services are becoming fused (see earlier posting).
On the issue of collaboration, new technologies (e.g., cloud computing and virtual worlds) and organizational structures are driving the changes. As we argued in Virtual Worlds and the Transformation of Business:
Government policy should focus on the fact that the U.S. will compete based on its ability to develop collaborative skills, not traditional business skills. Innovative policies should help corporations bring in social networking practices. Changes in the tax code could encourage investment in collaboration skills, networks of collaborative enterprises, and a new collaborative infrastructure. The federal government and states should also promote policies to promote faster development of cloud computing, scalable data storage, and open networks. They should also develop innovative training programs that educate businesses and employees about how to use collaborative technologies and integrate them into traditional disciplines.
In other words, we need a policy built on the new realities of the production process.
The folks over at Marketmerge have published an interesting report on the role of IP in M&A activity. According to the survey of both private equity and corporate players, IP is becoming a more important factor in M&A transactions. But issues of valuation are a concern. As the report notes:
respondents believe IP value is not fully reflected in traditional valuation methods like cash flow projections. Respondents rated exposure to patent litigation, freedom to operate and strength in key markets highest (at least 4 out of 5) in terms of importance, and yet all of these factors are overlooked or not readily incorporated by traditional valuation models. The failure of traditional valuation models to capture the unique features of IP assets and risks contributed to the particularly strong dissatisfaction of private equity respondents with current valuation techniques.
In other words, it is important to include intangibles (infringement risk, freedom to operate) in the valuation process for other intangibles (IP). This need to look at the intangibles of intangibles illustrates just how hard the valuation process really is. It is not clear to me that financial valuation models can ever incorporate all these factors. Better disclosure is probably the real goal.
Interestingly, there is a split between private equity and corporate responses to the problems of disclosure. Private equity respondents saw the failure to identify IP risks as a major problem in due diligence, whereas corporate respondents were more concerned about insufficient due diligence resulting in the failure to identify IP opportunities. That split probably reflects the differing motivations — investment versus operation. Each perspective has a slightly different take on what information is most important in the disclosure (due diligence) process.
PS – thanks to Mary Adams over at the blog IC Knowledge Center for bringing this to our attention.
Yesterday’s speech by President Obama on clean energy also touched on general issues of technology policy. One of which was that the proposed budget makes the R&D tax credit (more correctly called the Research and Experimentation Tax Credit) permanent. As the White House press release points out, “The credit has been extended 13 times with some extensions lasting just 6 months, and has also been allowed to lapse for almost a year – undermining its effectiveness because companies can’t count on it.”
This is not the first time a budget has been proposed that makes the R&D tax credit permanent. Every time, Congress has decided that it can save a little money in the budget process by passing a limited extension. With a price tag for the R&D tax credit of $75 billion and a lot of political backlash against the President’s multi-trillion dollar budget, I’m sure Congress will be tempted to take the short-term extension route once again.
The funny thing, however, is that the short-term extension route doesn’t really save any money overall. It is simply a budget policy game. The tax credit is still available each year and the money is still paid out (or, technically, the taxes still go uncollected). What the game does do is undermine the effectiveness of the program. With all the uncertainty, the incentives that the tax credit is supposed to provide look shaky to companies — thereby rendering them less of an incentive.
In a perverse way, such budgetary games may save the taxpayer money. By reducing the certainty, and therefore the power, of the incentive, fewer companies undertake activities that qualify for the tax credit. Thus, its expenditures are lower than what they might be. On the other hand, the multiple extensions waste more taxpayer money by making the program less effective.
It is like the no-cost approach we are trying to take toward zombie banks (as I mentioned yesterday). “Let’s try to do it as cheaply and as risk-free to the taxpayers as possible — thereby guaranteeing that it really won’t work and therefore actually increasing the cost to the taxpayers.” It is like the standard excuse for failure — “well, I’ll try.”
As Yoda once said: “Do or do not. There is no try.” Or as Nike has told us for years, “Just do it.”
That is my advice to Congress on the R&D tax credit. Just do it!
More on mark-to-myth from an op-ed by James Chanos in today’s Wall Street Journal – We Need Honest Accounting
Mark-to-market (MTM) accounting is under fierce attack by bank CEOs and others who are pressing Congress to suspend, if not repeal, the rules they blame for the current financial crisis. Yet their pleas to bubble-wrap financial statements run counter to increased calls for greater financial-market transparency and ongoing efforts to restore investor trust.
. . .
Obfuscating sound accounting rules by gutting MTM rules will only further reduce investors’ trust in the financial statements of all companies, causing private capital — desperately needed in securities markets — to become even scarcer. Worse, obfuscation will further erode confidence in the American economy, with dire consequences for the very financial institutions who are calling for MTM changes. If need be, temporarily relax the arbitrary levels of regulatory capital, rather than compromise the integrity of all financial statements.
My thoughts exactly.
On Friday, the Department of Energy announced the first award under its loan guarantee program, “a $535 million loan guarantee for Solyndra, Inc. to support the company’s construction of a commercial-scale manufacturing plant for its proprietary cylindrical solar photovoltaic panels.” The program was created back in the Energy Policy Act of 2005 — Title XIII – Incentives for Innovative Technology.
Unlike other programs, this loan guarantee is explicitly not an R&D program. Its purpose is the commercialization of new technologies. It is targeted at overcoming the problem of scaling up from the proven demonstration stage to the full-scale commercial operating stage.
It is interesting to note that the program recognizes the importance of intangibles in the commercialization process. The regulations on the loan guarantee program (10 CFR Part 609) acknowledge that fact by explicitly requiring that access to the intangibles be safeguarded if the loan goes into default:
§609.10 Loan Guarantee Agreement
(11) The Loan Guarantee Agreement and related documents include detailed terms and conditions necessary and appropriate to protect the interest of the United States in the case of default, including ensuring availability of all the intellectual property rights, technical data including software, and physical assets necessary for any person or entity, including DOE, to complete, operate, convey, and dispose of the defaulted project;
Given that, it would be interesting to know if DOE’s Credit Review Team required Solyndra to put up its intangibles as collateral on the loan, whether DOE even took those intangibles into account, and how they valued them.
On Friday, I was at a Google DC Talk on a new paper Envisioning the Cloud: The Next Computing Paradigm, written by the consulting firm Marketspace and commissioned by Google. The paper was a good summary of the potential of cloud computing (where the information resides at large data centers and is therefore much more accessible). According to the press release:
The paper outlines key factors that can allow consumers, businesses and the government to realize the full potential of the cloud:
• Full connectivity: Government policies should encourage the deployment of wireline and wireless broadband access so that users can access cloud-based services anytime, anywhere.
• Open access: A combination of market forces and FCC enforcement of existing laws can ensure users enjoy unfettered access to the Web sites and services of their choice.
• Reliability: Competition will continue to drive cloud providers to enhance their reliability. Many companies already offer contracts that effectively guarantee near 100 percent uptime.
• Interoperability and user choice: Because cloud computing is at such a nascent stage, forcing standards of interoperability may actually impede innovation. Because consumers already demand interoperability and portability, the market will drive providers to compete on these bases.
• Security: Cloud providers must make a compelling case to users that their data is safe. While competitive market forces will drive service providers to differentiate themselves on security, the government can play a role by aggressively enforcing cyber-crime laws.
• Privacy: A combination of market-driven policies and government action can best protect the data that consumers and businesses store online. Industry should develop common standards for security and privacy, and institute more protective and transparent privacy policies. Government should shield consumer data from inappropriate government scrutiny and define the rights of companies to use data about their users for commercial purposes.
• Government adoption: The federal government should become an early adopter and fund research. It can also accelerate competitive forces by insisting on standards to enhance privacy, security, openness, sustainability and interoperability.
These recommendations fit with the conclusions of our own report on Virtual Worlds and the Transformation of Business, specifically the recommendation that government policies address the need for the technical infrastructure. Cloud computing is a key enabling technology to allow for a variety of new collaborative technologies, such as virtual worlds. As our Virtual Worlds report points out:
online social networking and Web 2.0 platforms are likely to transform core business operations and interactions with suppliers, customers, and supporting services. Virtual Worlds platforms that form the core of a new corporate operations ecosystem will not only allow for horizontal and vertical interactions but will expand the essential business, partner, and management linkages that enhance productivity over the long term.
Getting the cloud right is an important step toward reaching that potential. This new paper points us in the right direction.
With the release of the Treasury Department’s latest program to deal with toxic assets, it might be helpful to go back to basics. Here is the general problem, as outlined by Tim Geithner in an op-ed in the Wall Street Journal:
Many banks, still burdened by bad lending decisions, are holding back on providing credit. Market prices for many assets held by financial institutions — so-called legacy assets — are either uncertain or depressed. With these pressures at work on bank balance sheets, credit remains a scarce commodity, and credit that is available carries a high cost for borrowers.
Last November, Hank Paulson made a very revealing comment in an interview in the FT :
“There are two ways of getting at illiquid assets,” he said.
“One is to purchase them and have a price discovery that comes with that and the capital flows that come with that.
“The other is making sure banks have plenty of capital and encouraging them to continue the process of recognising losses and selling these assets.”
For some time, we have been following the latter approach, referred to as “recapitalization” or, by some, as “nationalization.” The new plan is based on the former.
In either case, there are a few major challenges. The financial system needs some incentive to disclose and get rid of the toxic assets (now called “legacy assets”). Some banks have already bitten the proverbial bullet and written off those losses. Others seem unwilling to take the hit – probably for good reason, since such a write-down might jeopardize their very existence. In some cases, they appear to be holding out for the “mark-to-myth” solution, where mark-to-market accounting rules will be suspended and they can value these illiquid assets at whatever looks good for the balance sheet. The danger with this approach, however, is that it simply allows the bad assets to fester. And, as long as bankers and everyone else are sitting there looking at a pile of junk being counted on the balance sheet, lending will not resume. Therein lies the way to create our own “lost decade” similar to Japan in the 1990s.
Creating the incentives to get the bad assets off the books is therefore key – Paulson’s “encouragement.” The nationalization route, to me, never offered the banks enough of an incentive to take the write-off hit. Bankers could simply take the cash and sit on it. Even with almost complete national ownership, the government had limited control — witness AIG. So I have long favored the bad asset purchase approach.
In this case, however, the political system has put constraints on the process. Politicians and the public understand that the government has to absorb the risk if the banks are to give up the bad assets. But they don’t want the government to put taxpayer dollars at risk. At the same time, they are putting pressure on the accounting system (through calls for suspension of mark-to-market) to lessen the regulatory requirements to write off the bad assets.
In other words, everyone is looking for a no-cost way to get around the basic fact that trillions of dollars of asset value has vanished. As long as the financial system refuses to write off those losses, the system will remain both frozen and fragile.