Open Government Data Ramps Up

This month’s executive order and administrative policy on open data come four years after the launch of Data.gov and an order tasking agencies to provide at least three “high-value datasets.” The hype is already building around the impact of the next phase of open government data.
Following the executive order issued on May 9, 2013, Making Open and Machine Readable the New Default for Government Information, the Office of Management and Budget (OMB) issued an Open Data Policy establishing guidance for agencies to release data in “open, machine-readable formats.” (Even the policy itself is open.)
Over the next three months, agencies are expected to incorporate this new policy into performance goals. A six-month timeline is set for agencies to update policies and create a public listing of available datasets. Within 30 days of the policy release, federal chief information officer Steven VanRoekel and federal chief technology officer Todd Park were tasked with publishing “an open online repository of tools and best practices.” Not long after the policy announcement, OMB and the Office of Science and Technology Policy (OSTP) launched Project Open Data, including implementation guidance, tools, resources, and case studies. Within 90 days, OMB will integrate the policy into governance for purchasing of agency IT systems and services.
The Sunlight Foundation’s John Wonderlich responded with enthusiasm to the release of more government data, but he noted that delivery of this data is supposed to be done without any additional spending. Agencies are also supposed to take the “mosaic effect” (piecemeal pieces of information that pose a risk when combined) into consideration with the information they make public. Before disclosing information, an agency must ask whether other publicly available data (in any medium, from any source) could be combined with it to identify an individual or pose another security concern. This raises questions about what datasets agencies will release. As Wonderlich noted, “Concerns like cost, privacy, and security will be used to justify non-disclosure (as they often are), and will be used to try to justify keeping even a description of many datasets private.” This suggestion reiterates that the barriers to delivering high-value datasets are not technical ones.
Last May, with the Digital Government: Building a 21st Century Platform to Serve the American People, agencies were directed to create public application programming interfaces (APIs) that could be leveraged by government and private developers. And, on Thursday, May 23, 2013, government officials will release the final set of almost 300 APIs that will enable users to stream information from agencies to computers, websites and mobile applications. Officials will continue adding APIs to the list after the launch. It’s hoped that this API catalog will enable private companies and non-profits to leverage government data, as happened with Global Positioning System data. In remarks delivered in Austin, Texas on May 10, 2013, President Obama explained that greater access to government information will “fuel more private sector innovation and discovery,” yield entrepreneurial opportunities, enable startups, and promote economic growth. It also has the potential to improve the solutions available to government organizations.
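To give a concrete sense of what “open, machine-readable” means in practice: under the Project Open Data guidance, agencies publish catalog entries as JSON that developers can filter programmatically. A minimal sketch in Python follows; the sample record and URLs are illustrative rather than an actual agency entry, and the field names follow the conventions of the Project Open Data metadata schema.

```python
import json

# Illustrative catalog entry loosely following the Project Open Data
# metadata schema; real entries live in an agency's published catalog.
sample_entry = """
{
  "title": "Example High-Value Dataset",
  "keyword": ["spending", "transparency"],
  "distribution": [
    {"mediaType": "text/csv", "downloadURL": "https://example.gov/data.csv"},
    {"mediaType": "application/pdf", "downloadURL": "https://example.gov/report.pdf"}
  ]
}
"""

def machine_readable_urls(entry_json):
    """Return download URLs for distributions in open, machine-readable formats."""
    open_formats = {"text/csv", "application/json", "application/xml"}
    entry = json.loads(entry_json)
    return [d["downloadURL"]
            for d in entry.get("distribution", [])
            if d.get("mediaType") in open_formats]

print(machine_readable_urls(sample_entry))  # only the CSV distribution qualifies
```

The point of the sketch is the filtering step: a PDF scan of a spreadsheet is “open” but not machine-readable, and the policy’s value depends on agencies publishing the formats a program can actually consume.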
The increasingly liberal use of the phrase “treasure trove of data” in referring to the volume of information within the government calls to mind another expression about the relative nature of “treasure.” While developers will put data to work, the value it returns will depend on the viewpoint. For the most part, open government data has been outward facing, related to products and activities of agency mission areas. Inward-looking information, related to management and decision making, comes sparingly by comparison.
Government contractors that expand and improve current products and services stand to benefit from the move toward open data. Since open and machine-readable data will be part of governance for federal IT purchases going forward, products and services that already meet those requirements will be well positioned.

Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWin IQ. Follow me on Twitter @FIAGovWin.

Reintroduced DATA Act Aims to Curb Federal Waste and Fraud

The Digital Accountability and Transparency Act, known as the DATA Act, was formally reintroduced in both the House and the Senate on May 21, with Rep. Darrell Issa (R-CA) sponsoring the House bill.

The DATA Act would require agencies to use standard formats to share internal and external federal spending information and to make it available on a searchable web platform.  The legislation calls for Treasury to establish the data standards in consultation with OMB, GSA, and the heads of federal agencies, and to make the data publicly accessible in bulk, machine-readable format via an improved site.

The DATA Act originally made its debut in the last Congress, passing the House in April 2012 but dying in the Senate after only one hearing.  Sen. Mark Warner (D-VA) introduced similar legislation in the Senate in September.

The Obama administration is also promoting open data by way of an executive order issued to agencies on May 9th.  However, the purpose and focus of the executive order is to promote continued job growth, government efficiency, and the social good by making federal data more open and widely accessible, whereas the DATA Act is aimed specifically at federal spending transparency and accountability.

The last time around, the Obama administration did not support the DATA Act, complaining that it would create a new set of regulations and rules and would add more complexity and burden for agencies.  Changes have been made to the legislation since the original 2012 draft, but it is unclear whether it will garner support from the White House.

The bill is backed by the Data Transparency Coalition, headed by Hudson Hollister, former counsel to Issa’s committee, and has the support of House Majority Leader Eric Cantor (R-VA).  The legislation is also backed by the Sunlight Foundation, a government watchdog that promotes greater government openness and transparency and provides new tools and resources for media and citizens.

Whether the act passes or not, the administration will continue to pursue transparency as a way to stimulate innovation, increase efficiency and reduce waste.  Contractors, grant recipients, and agencies should expect financial reporting requirements to persist and even increase in order to add transparency across government spending and decrease waste, fraud, and abuse.  



Sunshine Week: Transparency of Texas, FY 2012 IT expenditures

Sunshine Week, which coincides with National Freedom of Information Day (March 16), is a national initiative organized by the American Society of News Editors to highlight the importance of open government to the public. In recognition of Sunshine Week, Deltek’s analysts will be taking transparency and contract data collected from state transparency websites and our own GovWin IQ database to highlight IT expenditure trends and procurement analysis in the state and local market.

Today, we take a look at total IT spending for the state of Texas for FY 2012, which was gathered from a top-ranked Texas transparency website. The cumulative spending data collected represents a variety of purchasing vehicles, including purchase orders (PO), statements of work (SOW), procurement cards, and statewide and agency-specific contracts used to purchase IT commodities and services.
Texas spent $132 billion in FY 2012; at $693 million, IT spending made up only 0.524 percent of that total. In FY 2012, Texas spent 89.2 percent of its total IT expenditure on services rather than commodities, most of it by the same 10 to 12 state agencies, commissions, and institutions. The health and human services, higher education, justice and public safety, and transportation verticals represented roughly 71 percent of all commodities purchased. The health and human services, justice and public safety, social services, and public finance verticals encompassed approximately 62 percent of all services spending.
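As a quick sanity check on the figures above, the IT share can be recomputed from the rounded totals quoted in this post; the small gap from the reported 0.524 percent comes from rounding of the underlying amounts.

```python
# Rounded figures quoted above
total_spend_fy2012 = 132_000_000_000  # total Texas FY 2012 spending ($132 billion)
it_spend_fy2012 = 693_000_000         # FY 2012 IT spending ($693 million)

it_share = it_spend_fy2012 / total_spend_fy2012 * 100
print(f"IT share of total spending: {it_share:.3f}%")  # 0.525% with these rounded inputs
```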
Each IT line item (software, hardware, maintenance services, etc.) was grouped under one of the following categories: IT and telecom commodities; telecommunications services; IT professional services; and IT and telecom repair and maintenance services.
Analyst’s Take
With such a concentrated spending pattern, qualified IT vendors looking to do business with the state of Texas are best served by focusing their efforts on agencies and departments within these top verticals. IT vendors looking to win a share of that $693 million should learn more about the state’s procurement process. For instance, Texas funnels most of its standard IT procurement through statewide cooperative contracts, which are handled by the Department of Information Resources (DIR). Most statewide contracts come up for renewal every four to five years, and Deltek’s state and local team monitors these contracts as well as more specialized IT procurements not supported by DIR.
A few statewide contracts currently being monitored in Deltek’s GovWin IQ database include:
Deltek will publish a full-length report, “State Government Transparency Report 2013,” providing detailed itemized IT expenditures for the state of Texas and many other states in the coming weeks.
GovWin IQ subscribers can learn more about these statewide contracts in the provided links. Non-subscribers can gain access with a GovWin IQ free trial.

Social Media Week: How social media has already changed the way we talk to governments

Back when I worked in Washington D.C. at a nonprofit that doled out best practice policy advice to state and local government leaders, part of my job was researching and answering questions from our members. Oftentimes a week would not go by without a city manager or department director asking about social media. Sometimes they asked what the best platforms were to use; other times they wondered about standardized practices and guidelines or legal and ethical pitfalls. Some wondered whether it was even appropriate for governments to be communicating in real time on Facebook or Twitter. Most of the time, they just wanted to use social media to better communicate with their citizenry. As one member put it to me: “In order to speak to the people I serve, I have to go to where they are. And social media is where most of my people are today.”

This powerful need for governments to gravitate toward Web 2.0 tools – even when they don’t fully understand how or why – has stuck with me as I’ve moved to the IT contracting intelligence market. It’s a powerful illustration of not just the need for governments to keep up with the technological habits of their citizens, but also how it can act as a catalyst for rethinking the way we interact with our civic leaders and vice versa. It also shows that there is still plenty of room for experimentation, creativity and selling in the private sector when it comes to integrating these tools into the IT arsenal of agencies, universities and policymakers.
Social media as a portal for G2C interaction
Source: “Social Media in State & Local Government: A New Paradigm for Engagement and Innovation”, Deltek 2012
By now it is no secret that one of the best ways to use social media in government is G2C, or government-to-citizen applications. As a direct communication or public relations tool, these apps can only be so distinct from the information presented on a government’s Web page or a written press release disseminated to newspapers. As a Canadian government official asked while delivering a speech last year on the potential of the medium: “How are … social media and interactive websites changing how public institutions conduct their business? Is the change profound or are we just replicating the use of traditional media on new platforms?”
Where the technology really distinguishes itself is through its ability to channel real-time, citizen-produced updates to government in order to coordinate more efficient action on the range of issues and problems that plague every city, county and town across the country. Larger cities or states with sophisticated, well-funded IT budgets might want to emulate the U.S. State Department’s CO.NX program, which connects users to officials through Web and video chats on a range of public policy issues. It would not be difficult for states to model and merge a program like this into their existing 311 call center technologies to provide a wider range of services relying on online interaction. San Francisco, Calif., currently integrates several social media platforms into its 311 system, giving citizens an easy and familiar portal to submit a help request when they may have no idea who to contact about a downed tree outside their apartment.
The spread of social media over the past 10 years has proven to be exponential. A pair of University of Illinois studies on local government social media use found a dramatic increase in government-to-citizen interaction from 2009 to 2011.
“The change in social media adoption is remarkable - increasing from two to five times over the levels observed two years ago,” according to the authors. The integration of popular sites like Facebook, Twitter and YouTube into government operations increased by 250 to 600 percent.
Source: “Social Media in State & Local Government: A New Paradigm for Engagement and Innovation”, Deltek 2012
Of particular interest to government contractors should be the rise and relatively untapped potential of open data portal technology. These portals often need to be customized to fit an individual government or agency’s information-sharing needs, and thus cannot always be purchased off the shelf or adapted from available (and free) social networking applications. Oftentimes governments will need third-party expertise to design and integrate these portals into their existing IT infrastructure. Making this information easily accessible and user-friendly to those outside of government is crucial to the growth of this technology, something that policymakers who have spent a lifetime in government may not be best suited for.
Of the 75 largest U.S. cities included in the University of Illinois study, 12 reported the use of open data portals in 2011. That is a large jump from several years earlier when such portals were almost unheard of in local government; vendors can expect this trend to increase exponentially over the next five years throughout large and small state and local governments.
Still, we are just scratching the surface when it comes to the potential of G2C to meet the unique needs of state and local government. According to State Tech Magazine (which has a treasure trove of local government social media infographics that I cannot recommend highly enough), smartphone users will download more than 76 billion apps in 2014, and the app industry as a whole is expected to generate $55 billion of business by 2015. That graphic also does a fantastic job of showcasing some of the more innovative state and local G2C apps in the country, from the Sacramento, Calif., app that shows users the results of a restaurant’s latest food inspection; to Chicago’s Taxi Share app, which pairs up users heading in the same direction; to the city of Sparks, Nev., which has a mobile app guide to local stores, restaurants, hotels and events. Other graphics display the effective use of social media in public safety (Did you know that social media evidence used for a search warrant is accepted in court 87 percent of the time?) and show how big data analytic tools are changing the way governments approach and solve big challenges.
Social media use in emergency management
The other great early success of G2C interaction is in disaster and emergency management, where it has had a dramatic effect on how governments manage and coordinate their response to large-scale weather and public safety threats. Nowhere else is the ability to communicate back and forth between citizens and government more important than during a large-scale emergency, when traditional modes of communication may be down or overloaded. While social networking platforms are not entirely immune from these externalities, they do provide an excellent venue for micro-targeting a public safety organization’s response and identifying as well as prioritizing resource allocation to ensure maximum efficiency.
Twitter has proven to be an especially effective tool to this end, both because of its popularity across demographic lines and its simplified setup. According to a 2012 Pew report, Twitter usage among adults ages 18-44 ranges from 16 to 31 percent. That may not sound like much at first glance, but it is often enough to create an information “snowball effect,” in which advice and guidance spread effectively throughout an affected population. That is also the age group most likely to be physically able to provide assistance to other citizens during an emergency. Think “pushing cars out of snow traps during a blizzard” or “going door to door to help evacuate elderly citizens in the aftermath of a flood.”
Source: “Social Media in State & Local Government: A New Paradigm for Engagement and Innovation”, Deltek 2012
Public safety organizations were among the first to realize the potential of integrating G2C communication into their operations, and it has changed the face of emergency response in some amazing ways. A 2009 report by the International City/County Management Association on local government social media use during emergencies illustrates the multitude of ways Web 2.0 has helped localities mitigate the damage of a disaster with both proactive and reactive examples provided. The report looks at a geographically diverse set of case studies by enterprising localities as they utilized Facebook, Twitter, YouTube, Foursquare and text alerts to more effectively prepare and respond to floods, tornados, snowstorms, the H1N1 flu virus and other emergencies.
It is important to understand that the use of social media by emergency management and public safety agencies is not a one-way street. In addition to the G2C interactions, many public safety agencies provide forums on Twitter, Facebook, and even Pinterest to enable citizens to provide valuable information in a variety of areas. For instance, public safety agencies may receive information on emergency situations such as car accidents, robberies or other incidents that may leave the individual unable to make a phone call. Being able to tweet or send other messages to a public safety agency enables anyone to contact their local police agency, assuming they have a presence on social media.
For example, during Hurricane Sandy, the New York City 911 system was completely overloaded, receiving 10,000 calls per hour. To put that in perspective, the city typically receives 1,000 calls per day. The use of social media tools during the hurricane skyrocketed, with Instagram users posting 10 photos per second, leading to a whopping 86,000 images in a 24-hour period. And this is just one of many social media sites.
This is not to say that these photos or other social media messages directly saved lives, but the onslaught of information allows emergency management agencies to understand the who, what, when and where during an emergency.
When citizens can directly engage with public safety agencies, those agencies may find it necessary to utilize social media tools. In March 2012, Deltek looked at the possible rise of social media management software, which allows agencies to utilize social media for direct engagement with the community and to sift through information sent by the public. While this type of software system has not taken off across the country, it may become a necessity as more agencies utilize social media tools.
Social media use in higher education
According to a 2012 Pearson survey on social media by universities, nearly two-thirds of higher education faculty use social media on a monthly basis, and nearly half (45 percent) for professional purposes. When it came to social media use in the classroom, the number was significantly lower (a little more than one third), but in all cases there was a very strong age correlation, with younger professors reporting much higher rates of social media use than their older counterparts. This trend indicates that the overall proportion of professors and faculty incorporating social media into their curriculum should only increase as time goes on. Of the top Web 2.0 platforms used in class, blogs and wikis were the most prevalent, with podcasts and Facebook ranking second and third. Other popular platforms like LinkedIn and Twitter had very low usage rates, indicating that faculty has yet to figure out a proper teaching use for these sites.
Source: “How Today’s Higher Education Faculty Use Social Media”, Pearson and Babson Survey Research Group, 2012
As with governments, social media can also be utilized to great effect during disaster or emergency management crises, such as severe weather or school shootings. After the infamous Virginia Tech shootings in 2007, a research group led by Leysia Palen, writing for Educause, documented how social networking was integral to disseminating information at almost every step in the immediate aftermath, from students using text and instant messaging to check on the safety of their peers, to the use of IM and Facebook to spread breaking news to alumni and the outside world, to Wikipedia updates on the shootings just an hour after the university sent out its first emergency alert.
The result of all this social media activity, the authors argue, was a “distributed problem-solving” model that was “collective and bottom-up rather than orchestrated and top-down.” This allowed lists that correctly identified victims to emerge online well before the university officially released that information. The study also examined 29 Flickr groups across six different disasters from 2004-2007, and found a distinct pattern whereby a few central accounts began rapidly aggregating images of each disaster as well as the accompanying media coverage, providing on-the-ground coordination and reporting that would be almost impossible to reproduce without Web 2.0.
The greatest potential for contractor involvement is with opportunities around social media management software and applications that can supplement off-class learning and virtual learning environments. As a range of technological trends converge over the next five years (virtual learning, BYOD, online universities), the need to connect teachers, students and resources all from one device will only accelerate. Social media and social networking functions already have permanence in the marketplace, and they will only continue to become more integrated with the way people, governments and education institutions communicate and disseminate information in the future.
For more information on this topic, download the free summary of the Deltek Report on Social Media in State and Local Government, here.
Or go here if you want to purchase the report in full.
Not a Deltek subscriber? Click here to learn more about Deltek’s GovWin IQ database and take advantage of a free trial.

Arizona Releases Statewide Strategic IT Plan For 2013 To Improve Efficiencies

The Arizona Strategic Enterprise Technology (ASET) Office has released a Statewide Strategic IT Plan for fiscal 2013, which builds on Governor Janice Brewer’s agenda by leveraging technology to enable a “more innovative, efficient, and sustainable government.”
In July 2011, Arizona’s Government Information Technology Agency merged with the Arizona Department of Administration’s Information Services Division to form the ASET Office, which now develops and executes the statewide IT strategy, while providing capabilities, services and infrastructure to ensure the continuity of mission critical and essential systems for the state.
As part of the Statewide Strategic IT Plan, which was developed in conjunction with the Governor’s Office, eight transformational initiatives were identified, defined and scoped to develop the strategic plan for 2013. These initiatives are expected to have a significant impact on the state as a whole - ensuring the business continuity and security of statewide assets, while providing citizens with the ability to access state services anywhere, any time.
Below, we highlight each of the initiatives which encompass the strategic plan for 2013, and detail how these will help the state moving forward:
1.     Implement a Continuous Improvement Culture - As part of the Governor’s commitment to reform state government, the Government Transformation Office (GTO) was established within the Department of Administration to implement a statewide continuous improvement program focused on education, process improvement projects, and capital impact.
Moving forward, Arizona will coordinate with the GTO to adopt improved and efficient policies and procedures. This coordinated effort will result in automating policies and procedures that are free of waste and inefficiency.  With an emphasis on service excellence and customer centricity, Centers of Excellence will be established throughout the state, which will offer recognition and reinforcement to best practices, while providing an opportunity for continued shared learning, as well as a continuous improvement culture.
2.     Accelerate Statewide Enterprise Architecture Adoptions and Asset Management - Over the past year, Arizona has made significant progress on the adoption of a statewide Enterprise Architecture (EA) strategy and framework. An EA advisory committee was established, a charter was developed and ratified, and an EA framework was selected. Accelerating this planning methodology throughout the state will result in “a more agile, efficient organization with more effective decision-making capabilities.”
As part of the EA expansion, Arizona will start with an assessment of technology contracts, infrastructure and applications. It will also begin to define and adopt a statewide Data Governance Model to improve the quality and accessibility of information. Together, these capabilities will accelerate the business decision-making process, streamline the planning and procurement of statewide assets, and reduce the overall cost of doing business.
3.     Implement A New Statewide Enterprise Resource Planning (ERP) Solution - In January 2012, Governor Brewer addressed her plan for operational reform. The state’s accounting system, a central operational system for the state’s employees, customers, and vendors, is outdated, with antiquated software and no external support. If the system fails, the consequences will span beyond state government and ultimately impact schools, businesses, and communities. Therefore, Arizona plans to implement a statewide ERP system that will replace the Arizona Financial Information System (AFIS) and a number of other central and agency-specific administrative systems. It will also provide new administrative system functionality that will benefit the entire state. The benefits of replacing this outdated system will be more efficient and effective business processes, better-informed and faster decision making, and improved business continuity.
4.     Expand E-government and Mobility Capabilities - In order to fulfill its vision, Arizona will begin to develop a statewide web platform to provide agencies with full content management functionality, mobile compatibility, and user identity management. Ultimately, this will allow agencies to deliver services faster, more consistently and securely, and to any device utilized by its citizens.
The state web portal is the gateway to Arizona and contains invaluable information about how citizens work, live, play, and interact with state government. A collaborative approach with key stakeholders will be established to modernize the portal by developing a new design, adding new capabilities, and making it easier for citizens to access state services.
5.     Implement Critical Business Continuity Improvements at the State Data Center - The State Data Center currently houses technology systems that are mission critical to the continuity of business. There are more than 140 state entities that leverage the data center’s infrastructure, services, and capabilities. Ensuring these systems are operational and secure is absolutely critical to the functions of the state. Arizona will begin initiatives to upgrade critical aspects of the facility itself, ensure redundancy and continuity of critical systems, and increase capacity to support the growing number of agency customers.
In addition to upgrading the current environment, Arizona will also facilitate the foundation of a cloud-computing environment by beginning to build a comprehensive virtualization infrastructure. By providing capabilities such as self-provisioning, service monitoring, and capacity management, the state will begin to provide state agencies with a cost-effective model for moving to “the cloud.” This will also allow for an improved way to plan and manage the cost of IT. Moving IT costs from a capital expenditure (CAPEX) to an operational expenditure (OPEX) model will result in a consistent sustainable model that will improve IT cost planning.
6.     Implement a New Statewide Infrastructure & Communications Network - The AZNet program was established several years ago to ensure Arizona has a cost-effective, efficient and consolidated shared telecommunications infrastructure to meet the needs of government agencies, their employees and the public. The next generation of the program is in progress to refresh the current infrastructure. This refresh will be an expansion of the central ring, which will extend network capabilities to agencies that are currently unable to receive services on the state network. Ultimately, this program will provide improved business continuity, reduced costs, and improved connectivity.
Additionally, the Digital Arizona program is playing an active strategic role in changing the definition of infrastructure and addressing middle-mile issues throughout the state of Arizona. The impact of this program is far reaching and will benefit education, economic development, public safety, and healthcare.
7.     Enhance Statewide Security and Privacy Capabilities and Training - Protecting citizen data, as well as the privacy of state employees, is of the highest priority for state agencies. Given the sensitivity of government data, the state’s environment of diverse technologies and data sources requires the adoption of robust and effective operational security and privacy programs.
As part of this initiative, Arizona will strengthen cyber security and privacy operations by supporting essential cyber-security technologies and continuing the implementation of a single-sign-on solution for all state employees. In addition, Arizona will optimize incident reporting and deploy an enterprise log aggregation solution for real-time threat detection and notification. Lastly, the state will strengthen cyber-security awareness by providing state employees with training on security and privacy policies, standards, and procedures that are essential to preventing security and privacy incidents.
8.     Streamline Project Oversight, Improve Transparency and Strengthen Project Management - To truly transform state government, it’s critical that Arizona clearly define its project deliverables and execute with precision. This requires a level of maturity in several areas, including program and project management, as well as oversight. In addition, Arizona must improve efficiency, increase transparency and strengthen accountability in the state’s project oversight process.
Leveraging the GTO and lean principles, Arizona plans to simplify the Project Investment Justification (PIJ) document and streamline the process from end to end. Through automation, Arizona will provide agencies with the ability to self-report and provide more accurate, current, historical, and aggregate reporting capabilities.
Our Take:
Overall, Arizona’s Statewide Strategic IT Plan outlines the various steps and investments the state is planning to make for fiscal 2013. As part of this IT transformation, Deltek expects opportunities in the areas of IT refresh, systems integration, communications, cybersecurity and IT services to arise as a result of these strategic initiatives.
Looking ahead, interested contractors should use Arizona’s past strategic IT projects to create business development justifications for IT solutions that will allow the state to accomplish its goals of creating greater efficiencies while providing an innovative, sustainable government.


On the Precipice: Big Data Will Change Everything

As we wrap up our soon-to-be-published report on Big Data, I can’t help but think about how profoundly technology will change our world in the years ahead. Big Data will drive great changes, but what kind will depend on what we do with it. No agency can afford to do Big Data just for the sake of organizing its data. The cost of capturing, managing, processing, storing, and analyzing data is simply too high not to have a sufficient payoff in terms of the value the data produces.

It has to yield real, actionable results that create substantial value to be worth doing. For example:
· Finding a cure for cancer
· Discovering how best to manage diabetes to dramatically improve health and lower costs
· Finding energy sources in our galaxy
· Finding ways to avoid loss of life from natural disasters
· Identifying and capturing terrorists before they strike, and
· Helping make American industries more competitive in world markets and creating jobs.
The value of big data lies in the knowledge, wisdom, and quality of decisions it enables, and in the impact of the actions that can be taken as a result of harnessing the power of data. Information is power in that it produces wisdom for good governance.
But the power that Big Data will create must be contemplated within an important cultural and ethical context. There is no doubt that Big Data analytics has already opened, and will continue to open, a Pandora’s box of ethical questions about the role of government. The recent public concern about the domestic use of drones and their potential to infringe on the privacy, liberties, and constitutional rights of individual citizens is one example. Michael Stonebraker, an MIT electrical engineering and computer science professor specializing in database research, also sees the dark side. "Privacy is going to be a huge issue [with big data] and it's largely going to be a political issue,” he said in an InformationWeek article entitled “Why Big Is Bad When It Comes To Data” (July 19, 2012).
President Eisenhower foresaw this day and, in his farewell address to the nation on January 17, 1961, he expressed concern that our national resources of human innovation and science and technology, if left to market forces concerned only with maximizing profit with no regard for ethical, just, and righteous uses and the impact on generations to come, would cause the deterioration of the very things that have made this nation great. It serves us well to remember what he said:
“In…the technological revolution during recent decades,….research has become central, it also becomes more formalized, complex, and costly. A steadily increasing share is conducted for, by, or at the direction of, the Federal government.
Today, the solitary inventor, tinkering in his shop, has been overshadowed by task forces of scientists in laboratories and testing fields. In the same fashion, the free university, historically the fountainhead of free ideas and scientific discovery, has experienced a revolution in the conduct of research. Partly because of the huge costs involved, a government contract becomes virtually a substitute for intellectual curiosity. For every old blackboard there are now hundreds of new electronic computers.
The prospect of domination of the nation's scholars by Federal employment, project allocations, and the power of money is ever present – and is gravely to be regarded.
Yet, in holding scientific research and discovery in respect, as we should, we must also be alert to the equal and opposite danger that public policy could itself become the captive of a scientific-technological elite. It is the task of statesmanship to mold, to balance, and to integrate these and other forces, new and old, within the principles of our democratic system – ever aiming toward the supreme goals of our free society.”
Recommendation: As vendors pursue federal big data business, they stand on the precipice of a whole new world. They cannot afford to get lost in the data and lose sight of their responsibilities in shaping the future for upcoming generations. Vendors would be well served, and would serve the interests of the federal government and the people it represents, by reading Eisenhower’s speech in full and keeping in mind the things they were challenged and charged with in their college business ethics class.




Hawaii Details 12-Year IT Roadmap To Streamline Business Processes While Improving Efficiency

Earlier this month, Hawaii unveiled a plan to overhaul the state’s use of technology to streamline business processes to improve the delivery of government programs and services.
As part of this, the newly-created Office of Information Management and Technology developed a 12-year roadmap which outlines the necessary steps that will drive the decade-long business and technology transformation. The Transformation Plan was developed in consultation with state agencies and after a thorough review of national best practices and lessons learned from other states. The effort is one of the key initiatives under Governor Neil Abercrombie’s New Day Plan, which calls for a transformation focused on jobs and investments in people to “ensure long-term economic prosperity and resilience.”
“A solid foundation must be built that will enable the state to continuously adapt in order to provide services, now and in the future,” said Hawaii CIO Sanjeev “Sonny” Bhagowalia. “This 12-year plan, which includes two years of planning and 10 years of implementation in multiple phases, will revolutionize the way information is managed to improve how programs and services are delivered to the public.”
As reported, Governor Abercrombie appointed Bhagowalia as the state’s first chief information officer in 2011, after recognizing that a large-scale effort was needed. Hawaii said the state has “not significantly invested in technology for more than 30 years,” while noting that “transforming the state’s $11 billion business enterprise with 220 business functions and services across 35 distinct lines of business is an enormous endeavor.”
The Transformation Plan will morph the current paper-based, inefficient business environment into one that is more cost-efficient, digital, and mobile-accessible. It will also consolidate the state’s 743 fragmented legacy systems into fewer, integrated, enterprise-wide solutions that facilitate improved information sharing.
Currently, Hawaii spends about 1.4% of its annual budget on technology, while most states invest around 2% to 3%. Industry best practices suggest spending between 3% and 5% of the annual budget on technology to realize the greatest benefits. The Transformation Plan is organized around three broad strategies:
  1. Streamlining and improving current business processes and applications to directly benefit the public.
  2. Leveraging the state’s investment in shared support services and technology infrastructure.
  3. Establishing a strong organization-wide management and oversight framework, including policies, processes, performance measures, program management and organizational change management.
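To put the spending percentages above in dollar terms, here is a quick back-of-the-envelope sketch. It assumes, purely for illustration, that the roughly $11 billion enterprise figure cited earlier approximates the annual budget base; all names are invented:

```python
# Back-of-the-envelope IT spending gap for Hawaii, using the ~$11 billion
# figure and the percentages cited in the text. Illustrative only: the
# annual-budget base is assumed to match the $11B enterprise figure.
ANNUAL_BUDGET = 11_000_000_000  # ~$11 billion (assumption)

current_spend = ANNUAL_BUDGET * 0.014       # ~1.4% -- Hawaii today
typical_spend_low = ANNUAL_BUDGET * 0.02    # 2% -- low end of most states
best_practice_low = ANNUAL_BUDGET * 0.03    # 3% -- low end of best practice

gap_to_typical = typical_spend_low - current_spend        # ~$66M per year
gap_to_best_practice = best_practice_low - current_spend  # ~$176M per year
```

Even reaching the low end of what other states spend would mean tens of millions of additional dollars per year, which is part of why the plan spans a decade.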
As part of the transformation, Hawaii has identified 11 top strategic technology priorities, which include:
  1. Enterprise Resource Planning (ERP) - Hawaii is moving forward with implementation of an enterprise-wide ERP system that will replace the large majority of the current central systems within the Enterprise Support Services band.
  2. Tax Modernization - This involves a strategic initiative to explore ways to streamline and modernize tax processing away from the current Integrated Tax Information Management System (ITIMS). It will expand the overall use of electronic tax filing, electronic payment, improved analytics, and improved case management processing to streamline and decrease cycle times for the citizens of the state.
  3. Health IT - Envisioning a more effective, efficient, patient-focused healthcare system, Hawaii’s Transformation Plan includes a four-point strategy of innovations for Delivery System Improvements, Payment Reforms, Health IT and Healthcare Purchasing. Hawaii is seeking systemic improvements in public health through measuring health status, performing assessments, and the tracking of preventions, promotion, and outcomes. The State will look to use Electronic Health Records and a secure exchange of information to improve care coordination, reduce duplication and waste, empower patient engagement in their health, and enable public health analytics to shape policy decisions that will improve the overall health system.
  4. OneNet/Enterprise Services Network - A single network, OneNet, will look to fulfill the network needs of all state departments, employees, and citizens with guaranteed performance levels.
  5. Adaptive Computing Environment (ACE) - Establishes a consistent configuration for computing devices across the State using pre-approved vendors. State employees can order standard systems that are engineered to operate most efficiently in the OneNet environment. Choices are provided based on job classification for mobile/tablet solutions, laptop/desktop, or a strictly virtual environment for certain work. These systems require fewer support resources than non-standard configurations, enhance overall support effectiveness, and reduce total cost of ownership.
  6. Shared Services Center - For the future State vision, the goal will be to have five fully meshed functional shared services centers (SSC) distributed across the islands to provide high availability, redundancy, fault tolerance, data backup and replication, disaster recovery, and always-on services to Hawaii. Connections between shared services centers will be provided with dedicated high-speed fiber optic lines with service providers and State wireless connections acting as redundant and backup links respectively.
  7. Information Assurance & Privacy - Hawaii has a fully integrated Security Operations Center (SOC) and Computer Security Incident Response Center (CSIRC) to: provide uninterrupted security services while improving security incident response times; reduce security threats to the State; and enable quicker, well-coordinated notification to all State Departments regarding security threats or issues.
  8. Mobile Computing - Hawaii is aiming to establish a standard mobile applications solution pattern and approach with standard methods, skill development, contractor resources, and tools/technologies in conjunction with the adoption of preferred smartphones and tablets. Since mobile application development has a very small footprint in the State at this time, this initiative will need to analyze, pilot, and implement a standard approach, capabilities, and tools for developing mobile applications.
  9. E-Mail, Collaboration and Geospatial - This effort is looking to provide several integrated services in a single environment, including integrated multimedia online communications services; collaboration and conferencing services; and multimedia content and information services.
  10. Open Government - Seeks to establish a State of Hawaii internal and public-facing website to facilitate the sharing of master data sets.
  11. Hawaii Broadband - Hawaii currently has many broadband projects underway as part of the Hawaii Broadband Initiative (HBI) with departmental participation that highlights the importance of the program. An assessment of the current program illuminates the fact that there must be strong unification of these disparate efforts within an established, disciplined program management framework with continual progress reports.
Our Take: Overall, we applaud the actions taken by the new CIO to outline how the State of Hawaii can streamline its operations while improving efficiencies through the use of technology.  With this in mind, Deltek expects opportunities in the areas of IT refresh, systems integration, portal development, broadband, and IT services to arise as a result of these significant IT efforts.
In terms of contracts, Hawaii currently has over 154 active GovWin tracked opportunities. The following is a breakdown of Hawaii’s top 5 opportunities (in terms of value) across all market verticals:
  1. Pharmacy Benefit Management Services and Fiscal Agent Services In Support of Pharmacy Claims Processing; Value: >$30 million; Primary Requirement: IT Professional Services; Award Date: February 2013.
  2. Statewide Telecommunications Equipment; Value: >$30 million; Primary Requirement: LAN/WAN Equipment; Award Date: January 2014.
  3. Hawaii Broadband Initiative; Value: >$30 million; Primary Requirement: Fiber Optic Materials & Components; Award Date: June 2013.
  4. Health Insurance Exchange Services; Value: <$30 million; Primary Requirement: Information Technology; Award Date: November 2012.
  5. Third Party Administrator (TPA); Value: <$30 million; Primary Requirement: Information Technology; Award Date: July 2013.


Wanted: Sexy Data Scientists

One of the biggest inhibitors for Big Data is the lack of professional data scientists. These quant jocks are in high demand for their ability to extract sought-after information from large datasets and present it in a way that decision-makers can readily understand, see the value in, and act upon.
The profession is the sexiest career of the 21st century, according to the Harvard Business Review. While I’m not so sure about the sexy part, data scientists will certainly be in high demand. They devise theories, design experiments, and test hypotheses, using technology tools to extract relevant information from data, such as identifying patterns that can be used to predict behavior. They are part scientist, part researcher, and part computer programmer. Data scientists usually work together with subject matter experts to understand the domain under investigation and to ensure that their hypotheses and designs are relevant.
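In miniature, that core loop of hypothesis, model, and prediction can be sketched as follows. This is a hypothetical illustration, not a real agency workload: the data, names, and the linear-growth hypothesis are all invented.

```python
# Minimal sketch of a data scientist's workflow: hypothesize that a metric
# grows linearly over time, fit the model by least squares, then use it to
# predict future behavior. All data and names here are invented.
import numpy as np

def fit_linear_trend(years, values):
    """Fit values = slope * years + intercept by least squares."""
    slope, intercept = np.polyfit(years, values, deg=1)
    return slope, intercept

def predict(slope, intercept, year):
    """Extrapolate the fitted trend to a future year."""
    return slope * year + intercept

# Invented example: annual dataset downloads from an open-data portal.
years = np.array([2009, 2010, 2011, 2012])
downloads = np.array([100.0, 150.0, 200.0, 250.0])

slope, intercept = fit_linear_trend(years, downloads)
forecast_2013 = predict(slope, intercept, 2013)
```

Real work differs mainly in scale and messiness: the hypothesis is rarely this clean, and validating the model against held-out data is where the science comes in.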
It is actually hard to know just how sexy these folks are because there are so few of them, and therein lies the problem. The availability of data scientists is not likely to keep pace with federal demand. According to a 2011 study by the McKinsey Global Institute, the U.S. could face a shortage of up to 190,000 data scientists by 2018. Data science is a new profession with few practitioners. It originated in academia, in computer science and management science, and began migrating into industry just a few years ago when Wall Street firms needed quant jocks. However, data scientists have thrived in the halls of government research and development labs, centers, and institutes for years, in places like the DOE labs and NASA. No doubt some of those were contracted from industry, but the best probably have long experience as government employees.
As part of the White House Big Data Initiative, announced in late March 2012, the National Science Foundation (NSF) is “encouraging research universities to develop interdisciplinary graduate programs to prepare the next generation of data scientists and engineers.” They are also providing “a $2 million award for a research training group to support training for undergraduates to use graphical and visualization techniques for complex data.”  These training initiatives, as well as re-training initiatives, will mitigate the shortage, but they won’t provide the talent that will be needed in the short-term.  
As the Big Data competition heats up, it will be interesting to watch vendors scramble for data scientist talent. And, no doubt, vendors will make it lucrative for the best to retire from government service and step into the consulting profession or the contracting pool. But luring away the best and brightest government data scientists might actually slow growth in the market even more, since they are often the agencies’ subject matter experts as well.
Contractors are likely to find that some of their best internal people with subject matter knowledge and IT know-how can make the leap with some added training in statistics and analytics. Universities are also ideal places from which to recruit talent. And as visualization tools improve, making the data scientist’s job easier, more people will be able to transfer their existing skills into the field.
With federal budgets tightening and the threat of sequestration, government contractors are looking for ways to make themselves more attractive.  Moving to growth niches is one way and the Big Data niche will be sexy for years to come.

Government 2.0: Government Trends - IT procurement transformation (Part 2)

Continuing from my previous blog on Government  2.0’s effect on the traditional IT procurement process, I wanted to take a look at trends in government’s approach to acquiring Gov 2.0 technology. Part 1 of this blog series highlighted how small Gov 2.0 IT firms have begun to use non-traditional purchasing options to circumvent the traditional procurement process. Gov 2.0 firms are trying to avoid the procurement process because it has historically been more difficult for small IT firms to compete in the government IT market. However, today’s trends in IT procurement hint that times are changing. Since governments continue to face shrinking IT budgets against expanding IT costs and needs, they are now looking for alternative ways to do business as well. For many IT bureaucrats and contracting officers interested in Gov 2.0 technology, that means looking outside of the conventional procurement process, and toward smaller IT firms.


Code for America (CFA), an example of Gov 2.0 realized, is an organization that describes itself as “Peace Corps for geeks.” Established in 2009, CFA assigns programmers on year-long fellowships to work with local governments on in-house IT projects, to provide faster and more affordable alternatives to procuring vendor services and solutions. CFA noticed that new IT products, which it calls “civic startups,” were often created once the fellows had completed their assignments – essentially spawning new businesses. However, these civic startups that had created products for governments were having trouble selling their products. Finding the government procurement process difficult to navigate, many fizzled.


In response to this issue, CFA is setting up its first civic incubator, where a handful of IT entrepreneurs will participate in a five to six-month-long program that will provide funding and mentoring, while bringing their applications and solutions directly to local governments and school districts. This incubator is something CFA’s leadership hopes will turn the traditional procurement process on its head.


Another noticeable trend on the rise is localities across the country sponsoring crowdsourcing events and hackathon competitions as an alternative approach to the traditional solicitation processes for Web development solutions and services. In 2010, after working with CFA, the city of Boston took a one-day hackathon event to the next level by creating a permanent office within local government. The New Urban Mechanics Office was created to hire full-time programmers for in-house IT development projects as well as to conduct outreach to encourage, field, and partner with small IT entrepreneurs.


In another CFA spinoff, White House Chief Technology Officer Todd Park created the Presidential Innovation Fellows (PIF) program to  pair “top innovators from the private sector, non-profits, and academia with top innovators in government to collaborate on solutions” using technology. On August 23, 18 innovators from outside of government were selected to work on one of five projects over the course of six months. The PIF program’s goal is to synthesize open data, expand e-government services, and simplify the RFP process by “building a platform that makes it easier for small high-growth businesses to navigate the federal government, and enable agencies to quickly source low-cost, high-impact information technology solutions.” 


One of the five projects, RFP-EZ, was born after Sean Green, head of the Small Business Administration’s Investment and Innovation Program, recalled an experience with the Department of Health and Human Services (HHS). The department needed an IT project that was estimated to cost $5 million if implemented by a traditional IT vendor. After some research and outreach, HHS partnered with smaller IT companies that were able to complete the project for just over $400,000.


During the PIF announcement, Green gave an open call to developers, contracting officers, and small Web development firms to join the effort and participate through an open platform where RFP-EZ will be demoed. If successful, many state and local governments will likely partner with or imitate the RFP-EZ project. Park took to Twitter after the announcement to confirm that the PIF will also be working with both state and local governments, in addition to the federal government, to create these IT solutions.


Analyst's Take


At all levels of government, innovation and affordability have been contradictory terms. Code for America debunked the idea that quality Gov 2.0 solutions and services had to come at a premium, and it seems to have been the catalyst for the future direction of Gov 2.0. Governments willing to take the early leap by circumventing their IT procurement processes and engaging with innovators directly can expect some growing pains, but those will likely be dulled by the ultimate cost savings.


The traditional format of government procurement has been grounded in the 1950s-style door-to-door salesman: governments release a solicitation and wait for the salesman to ring the doorbell and peddle his goods. Now, government agencies with strict budgets have the option to shop around without going through a lengthy and expensive procurement process. Governments want convenience and efficiency without having to sacrifice quality, and they are willing to go outside the traditional procurement process to get it.

Subscribers have access to the full article here, including expanded analysis and recommendations for contractors.
Also, be sure to follow Deltek's General Government Team on Twitter @GovWin_GenGov.

New York City hands over significant IT procurement authority to new, independent agency

Obtaining procurement information from the New York City Department of Information Technology and Telecommunications (DoITT) has never been easy. The sheer size and sophistication of the agency (between $750 million and $1 billion in IT spend per year), combined with the global recognition that comes from operating in one of the United States’ flagship cities, makes procurement a strong buyer’s market.
The DoITT does not have an overwhelming need to advertise bids because every technology vendor that does business in New York is already lined up at the door waiting for the chance to provide its services. There is rarely a need to engage vendors in conversations about upcoming projects because DoITT is large enough and specialized enough to identify, plan, and build the technical requirements for most IT solicitations. Perhaps the most emblematic example of Gotham City’s attitude toward procurement transparency can be found in its FOIA policies, where a year-long backlog of requests clogs the system. In a competitive procurement context, such a backlog virtually guarantees that by the time a vendor receives the information it has requested, it is far too late to put it to use.
You would think all this opaqueness and refusal to provide information to vendors would lead to a clean, corruption and scandal-free procurement environment. However, DoITT procurement has become a routine embarrassment to New York City Mayor Michael Bloomberg’s administration over the past year, with major cost overruns and improper spending plaguing the city’s 911 dispatch system (more than $1 billion over budget), payroll system ($650 million over budget), and personnel system ($300 million over budget). Yes, that’s $2 billion in cost overruns alone for these three projects over the past 12 months.
Frustrated by mismanagement and corruption, Bloomberg’s office announced last month that it was creating a nonprofit corporation to oversee all technology projects that exceed a certain dollar threshold – $25 million for single-agency projects, and $5 million for multi-agency projects.
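The oversight rule described above is simple enough to express directly. A hypothetical sketch of how the thresholds work (the function and constant names are invented; the dollar figures come from the announcement):

```python
# Hypothetical encoding of the new oversight thresholds: projects over
# $25M (single-agency) or $5M (multi-agency) fall under the nonprofit's
# review. Names are invented for illustration.
SINGLE_AGENCY_THRESHOLD = 25_000_000
MULTI_AGENCY_THRESHOLD = 5_000_000

def requires_oversight(estimated_cost: float, agency_count: int) -> bool:
    """Return True if a project exceeds the applicable dollar threshold."""
    if agency_count > 1:
        return estimated_cost > MULTI_AGENCY_THRESHOLD
    return estimated_cost > SINGLE_AGENCY_THRESHOLD
```

Note the asymmetry: a $10 million project spanning two agencies would be reviewed, while the same project inside a single agency would not, reflecting the added coordination risk of multi-agency efforts.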
For the complete version of this Analyst Perspective, click here (subscription required).