TTC’s Big Data for Defense Symposium Offers Insight into Air Force and Army Programs

It’s become a sure sign of autumn for me when the Technology Training Corporation’s annual Big Data for Defense and Homeland Security symposium rolls around in September.  TTC always manages to get top-notch speakers from both government and industry, and this year’s symposium was no exception.  The event runs two days and is hosted at the Holiday Inn in Rosslyn, VA.  These notes and comments provide a few highlights from the symposium.

Jeff Eggers, Chief Technology Officer in the Office of the Deputy Chief of Staff for Intelligence, Surveillance and Reconnaissance of the US Air Force (AF/A2D) began by providing an excellent overview of the Air Force’s recent efforts to enable the use of big data analytics in operational/tactical environments.  Stating up front that the Air Force is reviewing big data concepts and methods to dramatically change the way it processes and uses sensor intelligence, Eggers assured the audience that the goal of Air Force efforts is standardizing sensor data feeds to make all data discoverable.  The standardized data will pass through automated tools and go to so-called “all source” analysts for the first stage of analysis before it is distributed to warfighters for use on the operational level.  An example of such use would be identifying targets for precision fires.

Processing data quickly, however, is the key to making it usable.  To that end the Air Force is dedicating funds to implement what it calls Sensing-as-a-Service.  SensaaS is the concept of making all data from multiple sensors available via a single delivery platform.  The sensors are embedded in a system of systems, like the Distributed Common Ground System-Air Force, and the data and analysis would be made available to users as a web-based service or via a battlespace network.  SensaaS is currently in the research and development stage, but Eggers says he’s been assured the concept is workable.  From an industry perspective this suggests that additional investment is coming from the Air Force to field a proof-of-concept prototype.  Such an approach would be consistent with defense acquisition initiatives to make greater use of prototyping in procurement phases.
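The SensaaS idea (many sensor feeds behind one delivery interface) can be sketched in a few lines. This is purely illustrative: the class and method names (`SensingService`, `Reading`, `register_feed`) are invented for the sketch and say nothing about how DCGS-AF is actually built.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Reading:
    sensor_id: str
    kind: str      # e.g. "imagery", "signals"
    payload: dict

class SensingService:
    """Toy 'single delivery platform': many registered feeds, one query interface."""

    def __init__(self) -> None:
        self._feeds: List[Callable[[], List[Reading]]] = []

    def register_feed(self, feed: Callable[[], List[Reading]]) -> None:
        """Plug a sensor feed (any callable returning readings) into the platform."""
        self._feeds.append(feed)

    def query(self, kind: Optional[str] = None) -> List[Reading]:
        """Pull from every registered feed, optionally filtering by data kind."""
        readings = [r for feed in self._feeds for r in feed()]
        return [r for r in readings if kind is None or r.kind == kind]

svc = SensingService()
svc.register_feed(lambda: [Reading("uav-1", "imagery", {"frame": 42})])
svc.register_feed(lambda: [Reading("sig-7", "signals", {"freq_mhz": 433.0})])
print(len(svc.query()), [r.sensor_id for r in svc.query("imagery")])
```

The point of the pattern is that a consumer asks one service for data of a given kind and never needs to know which sensor, or how many, produced it.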

Lisa Shaler-Clark, the Deputy Director in Program Manager – Futures at Army Intelligence and Security Command (INSCOM), followed Mr. Eggers later in the morning with some fascinating comments on work being done to integrate Army intelligence with the Intelligence Community’s IC IT Enterprise, or ICITE program.  Shaler-Clark noted that Army INSCOM has made great strides moving data from stovepiped systems into an enterprise data warehouse.  This warehouse provides analysts with vastly improved data access, but it has also created a deluge of data for them to deal with.  The solution to that problem for INSCOM has been to host a Hadoop-based cloud analytics system to parse the data.  The data is tagged in multiple ways and then made available for analysis via a number of automated tools.  Data is also integrated into ICITE, and INSCOM is leveraging the NSA’s cloud for additional storage.
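Tagging each record several ways and indexing on every tag is what makes warehoused data discoverable from multiple directions. A minimal inverted-index sketch of that idea follows; the names are invented for illustration, and INSCOM's actual Hadoop-based tooling is of course far richer.

```python
from collections import defaultdict

class TaggedStore:
    """Toy warehouse index: every record carries several tags, and any
    tag (or combination of tags) can be used for discovery."""

    def __init__(self):
        self._records = {}               # record id -> record
        self._index = defaultdict(set)   # tag -> ids of records carrying it

    def add(self, record_id, record, tags):
        self._records[record_id] = record
        for tag in tags:
            self._index[tag].add(record_id)

    def find(self, *tags):
        """Return records carrying ALL of the given tags."""
        if not tags:
            return []
        ids = set.intersection(*(self._index[t] for t in tags))
        return [self._records[i] for i in sorted(ids)]

store = TaggedStore()
store.add(1, "report A", tags=["sigint", "region:east"])
store.add(2, "report B", tags=["humint", "region:east"])
print(store.find("region:east"))            # both records
print(store.find("sigint", "region:east"))  # only report A
```

Because every tag maintains its own index, an analyst can slice the same corpus by source type, region, or any other facet without restructuring the data.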

Finally, from the sound of what’s happening there, INSCOM is one of those places you’ll need to visit if your company sells analytics capabilities.  Be aware, though, that Shaler-Clark’s office isn’t interested in capabilities that duplicate what they already have.  They want new capabilities that enable them to do what they cannot already do today.

TTC is planning to follow up this symposium in November with its first conference on the Internet of Things.  This conference, Internet of Things for Defense and National Security, will be held on November 13-14 in Arlington, Virginia.  The line-up of speakers that I've seen so far looks very interesting.  Hope to see you there.



Continuous Monitoring Program Stalled, New Policy Forthcoming

In August 2013, the Department of Homeland Security (DHS) issued awards to seventeen vendors for a potential $6 billion contract to support a government-wide network threat monitoring program. On the heels of those announcements, the program implementation faced hurdles around legislation and budget. While progress seems to have been stalled during the shutdown, the Office of Management and Budget (OMB) is preparing to issue new policy providing direction to agencies on implementing federal information system continuous monitoring (FISCM).

Part of a massive effort, DHS’s Continuous Diagnostic and Mitigation (CDM) program will provide information technology tools and continuous monitoring as a service (CMaaS) to combat threats on government civilian networks. The program aims to put a common set of tools and services in place, aligned with national and industry standards. These capabilities would enable agencies to be more responsive to network anomalies.

The core capabilities for DHS’s continuous monitoring fall into five areas: hardware asset management, software asset management, vulnerability management, configuration management, and anti-virus. The continuous monitoring program outlined several approaches, including a service-based solution. These CMaaS solutions will be based upon NIST standards, including a number of guidelines set out in NIST’s 800 series of special publications.

DHS received $183 million from Congress in 2013 to support financing this effort for many agencies.  Although Blanket Purchase Agreements (BPAs) were awarded out of DHS’s Office of Cybersecurity and Communications Continuous Diagnostics and Mitigation Program, the contracts will be run by the General Services Administration (GSA). Earlier this month, the momentum reportedly stalled. Vendors expecting the release of a request for quote (RFQ) under the contract last week are continuing to wait for further action. The RFQ was expected to address agencies’ requirements for information technology inventory management tools for both hardware and software.

In the meantime, new information security policy from OMB is in the works to clarify the types of systems and data that are monitored. Recent reports suggest that the policy has been ready for a few weeks but officials have withheld the release pending further review. At over ten pages, the policy is expected to offer comprehensive guidance for federal information system continuous monitoring (FISCM).

Amid numerous recent cancellations of government events, an upcoming CDM workshop was announced. Scheduled for November 6, 2013, the event will explore threats to federal systems, leveraging diagnostics and monitoring to improve information security, as well as use cases and examples.

Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.

Balancing Security and Capability Remains Challenge for Mobile Adoption

The Mobile Work Exchange held its fall 2013 town hall meeting on September 12, 2013. The conference explored strategies for deploying a more mobile workforce, offering insight from over 20 speakers from both government and industry leadership.
In his opening address, the Bureau of Alcohol, Tobacco, Firearms, and Explosives’ Rick Holgate noted shifts in technology adoption over the last five to ten years. Holgate, the Assistant Director for Science & Technology and Chief Information Officer, cited findings from two surveys, saying, “One thing I think we would all agree on is that the federal workforce is extremely optimistic about the productivity that mobility represents and the potential productivity gains.” Indeed, the impact of mobility spans various areas like productivity, transportation, and real estate. Potential savings estimates range from $12 to $14 billion per year in efficiencies. These untapped efficiencies fall mainly into two areas: increasing workforce productivity and consolidating real estate.
Along with increased mobile capabilities over the past 5 to 10 years, the work environment has evolved. These advances in mobility have introduced new challenges, particularly related to security and privacy. Referencing the Mobile Security Framework, Holgate applauded “agencies that have somewhat different security perspectives and baselines and ways of thinking about security” collaborating to establish a government-wide baseline for mobile security. Traditionally, guidance documents from the National Institute for Standards and Technology (NIST) have identified security controls but left it up to individual agencies to determine how to apply them. This baseline guidance allows agencies to make progress with mobile adoption efforts, particularly around shared mobile device management solutions.
The theme of security challenges continued throughout the day. In his luncheon keynote, the Air Force’s Major Linus Barloon described various issues he’s encountered related to information security. Challenges persist around identifying ways to improve prevention of security incidents, spill containment, and re-establishing security. Technology has evolved to the point where previous approaches, like wiping machines and reintroducing them to computing environments, are no longer considered effective.
Based on his experience, Barloon suggested that getting devices into the hands of users is only a quarter of the problem around mobility. Noting the numerous contract vehicles and acquisition mechanisms, Barloon observed, “It’s very easy to get that device into your users’ hands.” Once that’s achieved, however, questions arise about governance, extending to legal, ethical, and acceptable uses for devices. With the shift to mobile environments, issues emerge around translating and applying risk management frameworks to mobile devices, determining how to apply risk principles to these devices, and defining how these devices will factor into continuous monitoring. It’s a balancing act, as Barloon described it: on the one hand, agencies aim to limit risk; on the other, they’re looking to increase operational capability.
In his closing, Holgate suggested the development of the next generation for the Digital Government Strategy is likely to assess agencies in terms of maturity of mobile adoption. This next step would also look to determine how to bring lagging organizations up to speed. Another area for development, Holgate noted, is in establishing metrics for program impact, especially in areas like workforce productivity and quality of citizen services.
The next Mobile Work Exchange session is scheduled for April 10, 2014. More information is available through the event site.
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.

Recapping the National Association of State Technology Directors (NASTD) Conference

As the 2013 National Association of State Technology Directors (NASTD) Conference wrapped up, both vendors and state IT officials may have left Charleston, S.C., with one message pounding in their heads: Watch out for storm clouds on the horizon.
Concerns over cybersecurity, employee retention and the pending roll out of FirstNet – the national public safety broadband initiative – dominated this year’s conversation as NASTD officials packed sessions with multiple speakers on each topic. Each subject has been more or less driven by a combination of current events and long-term trends.
The long-awaited wave of retiring baby boomers is finally underway and wreaking havoc on the ability of federal and state agencies to replace experienced personnel and retain institutional memory. After four years of planning and design, federal officials are getting ready to tally the number of states that will opt in to the federal FirstNet broadband plan and those that will build their own network. States received a wakeup call in October 2012 when nearly 4 million social security numbers and credit card data were hacked from South Carolina’s state government. The cyberattack brought to life the warnings that cybersecurity officials in the public and private sector have been quietly raising for years.
Most of the speakers opted to take an awareness approach and attempted to lay out the dire problems and statistics as plainly as possible; not because they were dodging the issues, but often because there are no obvious solutions to these problems. Besides, that wasn’t necessarily their job. Ultimately, these challenges are going to have to be addressed by the people who were sitting in the audience.
The dominant themes among these kinds of conferences for the past few years have been the recession, budget cuts, and figuring out how to maintain service levels with fewer resources. The conversation has begun to shift, but the major themes of NASTD 2013 demonstrated that the end of one crisis often provides state IT officials with just enough breathing room to prepare for the next.
Cybersecurity in the age of cloud adoption and the mobile workforce will be one of the preeminent issues state and local governments deal with over the next 3-5 years. The volume and sophistication of attacks directed at state governments is rising at an alarming pace every year, which means that more state CIOs are going to be expected to pursue aggressive security strategies over the next few budget cycles. More attacks similar to the South Carolina hack will ensure that funding and budgets for these areas are robust. Dedicated network penetration and training for staff to help identify common phishing techniques and personnel security measures were two methods that most security officials stressed at the conference.
In the public safety realm, vendors should be on the lookout for another handful of RFIs dealing with FirstNet development and implementation. Whether a state opts in or out of the federal plan, the NTIA foresees a considerable amount of private sector involvement for this project over the next few years, which is good news for vendors nationwide.
For the full version of the National Association of State Technology Directors’ Conference Recap, click here (subscription required).

Future of Open Data Draws Attention to Innovation, Policy

Representatives from federal agencies, transparency organizations, legislative staffers, technology companies, and activists gathered this month to discuss the future of open data policies. At the same time, some agencies already have their own initiatives underway to encourage innovation.
On September 10, 2013, the Data Transparency Coalition hosted the nation’s first open data policy conference. The discussion examined the impact of open data and explored opportunities for new tools to streamline processes and target waste, fraud, and abuse. Meanwhile, the government is looking to release the follow-on to its Open Government Action Plan, and the floor is open for comments and suggestions on encouraging public participation, improving management of public resources, and achieving effective collaboration to advance public services. Feedback submissions are requested by September 23, 2013.
Even as government, industry, and academia chart steps to open silos within the federal data portfolio, agencies are making strides to achieve greater agility by leveraging data. In a recent interview, Dr. Sasi Pillay, Chief Technology Officer for IT at the National Aeronautics and Space Administration (NASA), described several programs promoting innovation at the agency. One focuses on engaging individuals outside of NASA. Another focuses on engagement inside the organization by connecting the right NASA employees and staff.
This past April, NASA’s Open Innovation projects completed an activity that engaged over 10,000 people around the world over a 48-hour period to work on 58 different problems. Many of these problems were developed and presented by NASA employees who had encountered challenges they’d been unable to resolve. Over 770 solutions were developed in that 48-hour period, which Pillay considers “a huge amount of success.” Pillay estimates the return on investment (ROI) for NASA was between $10 million and $12 million for around $500,000 of investment, including logistics and civil service time. The next step for the effort will be to explore the developed solutions internally to determine if they can be implemented.
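Taken at face value, those figures imply a striking multiple. A quick back-of-envelope check using the conventional (value minus cost) over cost formula (the interview does not say which formula Pillay actually used):

```python
# Back-of-envelope check on the reported figures.
cost = 500_000                       # logistics plus civil-service time
value_low, value_high = 10_000_000, 12_000_000

roi_low = (value_low - cost) / cost
roi_high = (value_high - cost) / cost
print(f"ROI roughly {roi_low:.0f}x to {roi_high:.0f}x")  # roughly 19x to 23x
```

Even if the value estimate is generous, the implied multiple explains why NASA treats crowdsourced problem-solving as a bargain relative to the half-million-dollar outlay.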
NASA’s IT Labs program includes an annual poll for ideas and problems from within the organization. This is the second year the activity has been completed, and so far 25 technologies have been identified for further investigation and development.  The aim is for some of these investments to go on to benefit the rest of NASA.
Pillay notes that the challenges of continued declining budget spur two approaches: either reduce services or increase investment in innovation. One key area that received further attention and investment at NASA is mobility. Last year, 20 partners helped to develop a strategy for aggressively embracing and adopting mobile capabilities at NASA. Going forward the agency has 30 actionable small projects, some of which include Bring Your Own Device (BYOD) efforts. To this end, NASA is taking an approach that emphasizes securing data and applications rather than endpoint devices. The focus marks a departure from past approaches that sought to save money by standardizing end-point devices. Now, NASA hopes to accommodate a variety of devices and implement strategies that make information accessible while addressing data and security layer requirements.
In recent years, early investment and aggressive adoption of new technologies has not come without setbacks. This has been especially true of technologies that have prompted agencies to approach and manage data in a new way. Cloud computing adoption, for example, has required continued development of strategies, evaluation of security controls, and operational evaluations. So while some efforts hold the promise of cost savings or greater agility, it’s critical to ensure governance and adoption strategies are mature before rushing ahead.
For more information on the second National Action Plan (NAP 2.0) and details on submitting responses visit the Open Gov blog.
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.

APCO International 2013 Conference Recap

The 2013 Association of Public-Safety Communications Officials (APCO) conference was held August 18-21 at the Anaheim Convention Center in Anaheim, Calif. The conference included educational training sessions on a variety of important topics that are essential to the public safety communications industry. The training sessions covered nine different tracks: Frontline Telecommunicator; Supervision and Leadership Development; Communications Center Management; Regulatory and Legislative Issues; Emergency Preparedness, Response and Situational Awareness; Radio Technologies, LMR, Spectrum Management; Emerging Technologies and Applications; Current Event and Hot Topics; and NG911 and New Response Technology.

Keynote speakers tied their addresses with themes of this year’s conference: connect, innovate and accelerate. Author and adventure-seeker Erik Weihenmayer spoke about the challenges he’s faced as a blind man and the mountains he’s had to climb, literally.

At the 2012 APCO conference, FirstNet was just coming to fruition with a 12-member FirstNet board being announced. Several months after the event, the entire industry took part in a notice of inquiry for the FirstNet technical architecture. Since receiving more than 100 responses last year, FirstNet and its board have made great strides in turning the idea of a national broadband network into a reality. Initial funding has been allocated in the amount of $7 billion, along with $135 million for a new state and local implementation grant program that will be administered by the National Telecommunications and Information Administration (NTIA).

With a discussion of FirstNet came a lot of sessions on LTE and Land Mobile Radio (LMR). The rise in technological advancements is clear when looking at the features of LMR and LTE, and the provision of new standards and ways to package information with LTE will only increase in the years ahead.

Last month, Deltek and the Industry Council for Emergency Response Technologies (iCERT) published the FY 2014 Justice and Public Safety Market Overview, which provides a comprehensive look into spending within the industry since FY 2010 and a forecast for the upcoming fiscal year. Buried in the deep-dive analysis are some key takeaways that provide a glimpse into what government officials and vendors at APCO 2013 are facing.

GovWinIQ subscribers can read the full Deltek recap of the APCO 2013 Conference. Non-subscribers can gain access with a free GovWinIQ trial.



NIST Big Data Working Group Hits Homestretch to Roadmap Draft

The National Institute of Standards and Technology (NIST) Big Data Working Group kicked off this past June. Setting an ambitious timeline for a number of deliverables, the group expects to complete working drafts by the end of September 2013.
During the most recent session, the group outlined the deliverables, which will consist of working drafts for:
·          Big Data Definitions
·          Big Data Taxonomies
·          Big Data Requirements
·          Big Data Security and Privacy Requirements
·          Big Data Architectures Survey
·          Big Data Reference Architectures
·          Big Data Security and Privacy Reference Architectures
·          Big Data Technology Roadmap
The work behind these deliverables is being addressed by five subgroups, which will all contribute to the Big Data Technology Roadmap.
Definitions and Taxonomies
The initial definition for Big Data was proposed in January 2013 at the NIST Cloud/Big Data workshop. This definition states that “Big Data” refers to digital data volume, velocity and/or variety that:
• enable novel approaches to frontier questions previously inaccessible or impractical using current or conventional methods;
• exceed the storage capacity or analysis capability of current or conventional methods and systems; and/or
• differentiate by storing and analyzing population data rather than samples.
The group proposes a definition for Data Scientist, described as “a practitioner who has sufficient knowledge of the overlapping regimes of expertise in domain knowledge, analytical skills and programming expertise to manage the analytics process through each stage in the big data lifecycle. They handle value, veracity (quality), etc.”
Requirements and Use Cases
Use case templates are being developed to capture various aspects of requirements including:
·          Goals, Description
·          Data Characteristics, Data Types
·          Data Analytics
·          Current Solutions
·          Security & Privacy
·          Lifecycle Management and Data Quality
·          System Management and Other issues
To date, 45 use cases have been received. These use cases have been organized into seven categories and are available for review on the working group’s site. The categories and the number of use cases in each are as follows:
·          Government Operation (2)
·          Commercial (8)
·          Healthcare and Life Sciences (10)
·          Deep Learning and Social Media (6)
·          Ecosystem and Research (4)
·          Astronomy and Physics (5)
·          Earth, Environmental and Polar Science (10)
Security and Privacy
The requirements scope for big data security includes infrastructure security, data privacy, data management, as well as integrity and reactive security. Use cases studied for security and privacy issues covered retail (consumer), healthcare, media (social media and communications), government (military, justice systems, etc.), and marketing scenarios.
The Technology Roadmap subgroup will incorporate input from the other subgroups (Definitions and Taxonomies, Requirements and Use Cases, Security and Privacy, and Reference Architecture). This group will also explore potential standards related to big data activities, capabilities and technology readiness, a decision framework, big data mapping and gap analysis, and strategies for adoption and implementation.
The subgroups will aim to complete and discuss their final drafts on September 25th. Then, the materials will be presented and discussed at NIST’s Big Data Workshop on September 30th. Registration for the event is free; however, there is a registration cut-off date of September 23, 2013. More information is available on the event site.
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.

DoD Targets Rapid Mobile Technology Review and Approval Process

The Defense Department (DoD) supports approximately 600,000 smartphone users, and it is pursuing a strategy to support a broader range of devices. Recently, at the Defense Information Systems Agency’s (DISA) annual Forecast to Industry, mobility played a dominant role in the discussion.  In particular, stated goals stressed streamlining the review process for commercial products.
DISA presentations depicted a comprehensive mobility concept including capabilities for Voice/VoIP, email, texting, calendar, automation capabilities, unified communications, telecom expense management, mission partner applications, secure access to the Department of Defense Information Network (DODIN, formerly the Defense Information Systems Network, or DISN), and device security. The vision also includes a mobile app store and enterprise Mobile Device Management.
Source: DISA

Historically, it has taken anywhere from nine months to a year for new mobile devices, mobile applications, and operating systems to complete the DoD review process. Often, those technologies are outdated by the time they achieve approval. Jennifer Carter, the component acquisition executive at DISA, described one of the process challenges, saying, “The traditional DoD cycle times do not meet what is needed to get these capabilities out to the warfighter, and we don’t want to be where by the time we issue the device it’s obsolete and … you have to buy it on eBay.” To address this lag, DoD is partnering with industry to achieve more rapid deployment of commercial technologies by streamlining review and approval cycles. The goals include 30-day turnaround cycles for new hardware, new applications, and new operating systems.

Of the 600,000 smartphone users in the DoD, 470,000 use BlackBerry handsets and 130,000 are piloting iPhone and Android devices for security trials. Back in May, DoD approved the use of Samsung’s hardened version of Android (Knox) in smartphones, as well as BlackBerry 10 devices. Knox took a noteworthy approach to the Security Technical Implementation Guides (STIGs) by proactively considering the DoD’s security requirements.
The discussion also called out support needed from industry to close a number of gaps. Moving forward, DISA will be looking for:
·          security built into products,
·          alignment with NSA protection profiles,
·          enterprise license agreements for commercial applications,
·          enterprise based cost models, and
·          continued advancement of enabled secure mobile applications.
Mobility contract opportunities on the horizon include gateway procurement and enterprise solutions for mobile applications. The gateway request for proposals (RFP) is anticipated during the first quarter of FY 2014. This will be a single-award, firm-fixed-price contract. The request for information (RFI) for mobile applications solutions is also expected during the first quarter of FY 2014.
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.

NIST Starts Work on Big Data Roadmap

Close to 90 attendees participated in the National Institute of Standards and Technology (NIST) Big Data Working Group meeting on Wednesday, June 26, 2013. Led by Wo Chang, a digital data advisor with NIST’s Information Technology Laboratory, the session laid the groundwork for an ambitious collaboration to produce several deliverables by the end of September 2013.
NIST is leading development of a Big Data Technology Roadmap, which will define and prioritize requirements for interoperability, reusability, and extensibility for big data analytic techniques, as well as the technology infrastructure to support secure and efficient adoption. To help develop these ideas, NIST has formed a public working group to build a community of interest from industry, academia, and government.
The goals for the NIST Big Data Working Group (NBD-WG) are to develop definitions, taxonomies, reference architectures, and a technology roadmap to enable innovation. The four deliverables will consist of:
·          Big Data Definitions
·          Big Data Taxonomy
·          Big Data Reference Architecture, and
·          Big Data Technology Roadmap
Individual topics will be addressed separately by five subgroups:
  • Big Data Definitions and Taxonomies Subgroup 
  • Big Data Reference Architecture Subgroup 
  • Big Data Requirements Subgroup 
  • Big Data Security and Privacy Subgroup 
  • Big Data Technology Roadmap Subgroup 
For example, use cases and use case requirements will be handled by a subgroup reflector, enabling collaborative discussions via email. Then, progress and findings will be presented at the working group level during weekly meetings. Once co-chairs for the subgroups are named, they will help draft the subgroup charters and identify options for meeting logistics and coordination. All of the subgroups will contribute to the Big Data Technology Roadmap. A taxonomy identifying the key components and players will be tightly coupled with an ontology, and both will help to shape the reference architecture. Requirements will include both use case and other technical requirements. Requirements for the reference architecture will be examined and developed to align with the taxonomy.
The goal of the Big Data Technology Roadmap will be to define central processes and approaches leading to efficiency gap analysis for Big Data tools and practices. Specific milestones will be determined once the working group has initiated regular meetings, which will occur weekly on Wednesdays. However, the target date for completion of an initial draft for the Big Data Technology Roadmap has been set for Friday, September 27, 2013.
The working group also outlined an initial plan for developing these deliverables within the tight timeframe.
Participation in the working group is open to all interested parties. More information on the effort is available at the working group’s collaboration site.
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.

GSA Looks beyond Networx

Now that the government transition to Networx has been completed, the General Services Administration (GSA) is turning its attention to Network Services 2020. According to GSA’s Mary Davie, the agency is not waiting for the next iteration of the program and is aiming to implement changes as soon as possible.
The American Council for Technology Industry Advisory Council (ACT-IAC) hosted the launch of the Network Services 2020 (NS2020) Working Group. The event was organized by the Networks & Telecommunications (N&T) Special Interest Group (SIG) and included roundtable sessions for each of four committees:
GSA/Vendor Operations
Program Development, Goals and Metrics
Business Growth and Collaboration
Technology
The NS2020 Working Group aims to provide an ongoing forum for government and industry discussion on topics related to the entire lifecycle of the NS2020 program.
The GSA/Vendor Operations committee explored service ordering, the portal and pricer, vendor contract operations, and effectiveness around terms & conditions.  Investigating government and industry operations highlighted the importance of inventory management: good inventory can greatly improve transition success.

The Program Development, Goals and Metrics committee looked at transaction costs, service level agreements (SLAs), success factors, value proposition models, and program management approach. Suggestions from this roundtable included offering positive incentives for providers to exceed SLAs, making it financially attractive for providers to get solutions in place rapidly (through the government covering up-front costs), and improving inventory management.
The Business Growth and Collaboration committee discussed strategies for maximizing portfolio business, small business strategy, regional strategy, promoting competition, and collaboration. Three points that resulted from this topic were the need to re-iterate GSA’s value proposition, the need for better visibility into the services covered by the fees GSA charges, and the need for standardization (e.g., order placement, inventory, and billing).

The Technology committee addressed the current and future scope for products and services, innovation, and technology evolution. Needs recognized in discussion of capabilities not currently provided under Networx included ubiquitous access and cloud computing. The vision this group described for the future was unlimited bandwidth, on demand, anytime, anywhere, with any device. Achieving this future would require more focus on services and less modification of commercial offerings.
The near term challenge for the working group is identifying where technology is heading and what services should be included in the scope of NS2020 (e.g. cloud computing, satellite services). Currently, GSA is working on determining what can be improved with the Networx program; for example, how inventory and service ordering can be made easier. At least for the time being, government is looking to get direction from industry.
The next ACT-IAC N&T SIG meeting will be held on June 27th to discuss cloud computing services.
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.
