NIST Big Data Working Group Hits Homestretch to Roadmap Draft

The National Institute of Standards and Technology (NIST) Big Data Working Group kicked off this past June. Setting an ambitious timeline for a number of deliverables, the group expects to complete working drafts by the end of September 2013.
 
During the most recent session, the group outlined the deliverables, which will consist of working drafts for:
• Big Data Definitions
• Big Data Taxonomies
• Big Data Requirements
• Big Data Security and Privacy Requirements
• Big Data Architectures Survey
• Big Data Reference Architectures
• Big Data Security and Privacy Reference Architectures
• Big Data Technology Roadmap
 
The work behind these deliverables is being addressed by five subgroups, which will all contribute to the Big Data Technology Roadmap.
Definitions and Taxonomies
The initial definition for Big Data was proposed in January 2013 at the NIST Cloud/Big Data workshop. This definition states that “Big Data” refers to digital data volume, velocity and/or variety that:
• enable novel approaches to frontier questions previously inaccessible or impractical using current or conventional methods; and/or
• exceed the storage capacity or analysis capability of current or conventional methods and systems; and/or
• are differentiated by storing and analyzing population data rather than samples.
 
The group proposes a definition for Data Scientists, described as “a practitioner who has sufficient knowledge of the overlapping regimes of expertise in domain knowledge, analytical skills and programming expertise to manage the analytics process through each stage in the big data lifecycle. They handle value, veracity (quality), etc.”
 
Requirements and Use Cases
Use case templates are being developed to capture various aspects of requirements including:
• Goals, Description
• Data Characteristics, Data Types
• Data Analytics
• Current Solutions
• Security & Privacy
• Lifecycle Management and Data Quality
• System Management and Other Issues
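The template fields above amount to a structured record per use case. As a minimal sketch only, the following models that record in Python; the field names are illustrative paraphrases of the template headings, not the official NBD-WG schema:

```python
from dataclasses import dataclass, field

# Hypothetical record mirroring the NBD-WG use case template fields.
# Field names are illustrative, not the official template's wording.
@dataclass
class UseCase:
    goals: str
    description: str
    data_characteristics: str = ""
    data_types: list = field(default_factory=list)
    data_analytics: str = ""
    current_solutions: str = ""
    security_privacy: str = ""
    lifecycle_management: str = ""
    system_management: str = ""

# Example entry loosely based on an environmental-science scenario.
example = UseCase(
    goals="Archive and analyze environmental observations",
    description="Climate data ingest and retrieval",
    data_characteristics="tens of petabytes added per year",
    data_types=["satellite imagery", "sensor time series"],
)
print(example.goals)
```

Capturing each submission in a uniform record like this is what lets the working group sort 45 disparate use cases into comparable categories.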
 
To date, 45 use cases have been received. These use cases have been organized into seven categories and are available for review on the working group’s site. The categories and the number of use cases in each are as follows:
 
• Government Operation (2)
• Commercial (8)
• Healthcare and Life Sciences (10)
• Deep Learning and Social Media (6)
• Ecosystem and Research (4)
• Astronomy and Physics (5)
• Earth, Environmental and Polar Science (10)
 
Security and Privacy
The requirements scope for big data security includes infrastructure security, data privacy, data management, and integrity and reactive security. Use cases studied for security and privacy issues covered retail (consumer), healthcare, media (social media and communications), government (military, justice systems, etc.), and marketing scenarios.
The Technology Roadmap subgroup will incorporate input from the other subgroups (Definitions and Taxonomies, Requirements and Use Cases, Security and Privacy, and Reference Architecture). This group will also explore potential standards related to big data activities, capabilities and technology readiness, a decision framework, big data mapping and gap analysis, and strategies for adoption and implementation.
 
The subgroups aim to complete and discuss their final drafts on September 25th. The materials will then be presented and discussed at NIST’s Big Data Workshop on September 30th. Registration for the event is free; however, it closes on September 23, 2013. More information is available on the event site.
 
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on twitter @FIAGovWin.

DoD Targets Rapid Mobile Technology Review and Approval Process

The Defense Department (DoD) supports approximately 600,000 smartphone users, and it is pursuing a strategy to support a broader range of devices. Recently, at the Defense Information Systems Agency’s (DISA) annual Forecast to Industry, mobility played a dominant role in the discussion. In particular, the goals stressed streamlining the review process for commercial products.
 
DISA presentations depicted a comprehensive mobility concept including capabilities for Voice/VoIP, email, texting, calendar, automation capabilities, unified communications, telecom expense management, mission partner applications, secure access to the Department of Defense Information Network (DODIN, formerly the Defense Information Systems Network, or DISN), and device security. The vision also includes a mobile app store and enterprise Mobile Device Management.
 
 
(Figure omitted. Source: DISA)

 

Historically, it has taken anywhere from nine months to a year for new mobile devices, mobile applications and operating systems to complete the DoD review process. Often, those technologies are outdated by the time they achieve approval. Jennifer Carter, the component acquisition executive at DISA, described one of the process challenges, saying, “The traditional DoD cycle times do not meet what is needed to get these capabilities out to the warfighter, and we don’t want to be where by the time we issue the device it’s obsolete and … you have to buy it on eBay.” To address this lag, DoD is partnering with industry to achieve more rapid deployment of commercial technologies by streamlining review and approval cycles. The goals include 30-day turnaround cycles for new hardware, new applications, and new operating systems.

 
Of the 600,000 smartphone users in the DoD, 470,000 use BlackBerry handsets and 130,000 are piloting iPhone and Android devices in security trials. Back in May, DoD approved the use of Samsung’s hardened version of Android (Knox) in smartphones, as well as BlackBerry 10 devices. Knox took a noteworthy approach to the Security Technical Implementation Guides (STIGs) by proactively considering the DoD’s security requirements.
 
The discussion also called out support needed from industry to close a number of gaps. Moving forward, DISA will be looking for:
• security built into products,
• alignment with NSA protection profiles,
• enterprise license agreements for commercial applications,
• enterprise-based cost models, and
• continued advancement of enabled secure mobile applications.
 
Mobility contract opportunities on the horizon include gateway procurement and enterprise solutions for mobile applications. The gateway request for proposals (RFP) is anticipated during the first quarter of FY 2014 and will be a single-award, firm-fixed-price contract. The request for information (RFI) for mobile application solutions is also expected during the first quarter of FY 2014.
 
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on twitter @FIAGovWin.

NIST Starts Work on Big Data Roadmap

Close to 90 attendees participated in the National Institute of Standards and Technology (NIST) Big Data Working Group meeting on Wednesday, June 26, 2013. Led by Wo Chang, a digital data advisor with NIST’s Information Technology Laboratory, the session laid the groundwork for an ambitious collaboration to produce several deliverables by the end of September 2013.
 
NIST is leading development of a Big Data Technology Roadmap, which will define and prioritize requirements for interoperability, reusability, and extensibility of big data analytic techniques, as well as the technology infrastructure needed to support secure and efficient adoption. To help develop these ideas, NIST has formed a public working group drawing a community of interest from industry, academia, and government.
 
The goals for the NIST Big Data Working Group (NBD-WG) are to develop definitions, taxonomies, reference architectures, and a technology roadmap to enable innovation. The four deliverables will consist of:
• Big Data Definitions
• Big Data Taxonomy
• Big Data Reference Architecture, and
• Big Data Technology Roadmap
 
Individual topics will be addressed separately by five subgroups:
  • Big Data Definitions and Taxonomies Subgroup 
  • Big Data Reference Architecture Subgroup 
  • Big Data Requirements Subgroup 
  • Big Data Security and Privacy Subgroup 
  • Big Data Technology Roadmap Subgroup 
For example, use cases and use case requirements will be handled by a subgroup reflector, enabling collaborative discussions via email. Progress and findings will then be presented at the working group level during weekly meetings. Once co-chairs for the subgroups are named, they will help draft the subgroup charters and identify options for meeting logistics and coordination. All of the subgroups will contribute to the Big Data Technology Roadmap. The taxonomy identifying the key components and players will be tightly coupled with an ontology, and both will help to shape the reference architecture. Requirements will include both use case and other technical requirements. Requirements for the reference architecture will be examined and developed to align with the taxonomy.
 
The goal of the Big Data Technology Roadmap will be to define central processes and approaches leading to efficiency gap analysis for Big Data tools and practices. Specific milestones will be determined once the working group has initiated regular meetings, which will occur weekly on Wednesdays. However, the target date for completion of an initial draft for the Big Data Technology Roadmap has been set for Friday, September 27, 2013.
 
Participation in the working group is open to all interested parties. More information on the effort is available at the working group’s collaboration site: http://bigdatawg.nist.gov/home.php.
 
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.

GSA Looks beyond Networx

Now that the government transition to Networx has been completed, the General Services Administration (GSA) is turning its attention to Network Services 2020 (NS2020). According to GSA’s Mary Davie, the agency is not waiting for the next iteration of the program and is aiming to implement changes as soon as possible.
The American Council for Technology Industry Advisory Council (ACT-IAC) hosted the launch of the NS2020 Working Group. The event was organized by the Networks & Telecommunications (N&T) Special Interest Group (SIG) and included roundtable sessions for each of four committees:
• GSA/Vendor Operations
• Program Development, Goals and Metrics
• Business Growth and Collaboration
• Technology
 
The NS2020 Working Group aims to provide an ongoing forum for government and industry discussion on topics related to the entire lifecycle of the NS2020 program.
 
The GSA/Vendor Operations committee explored service ordering, the portal and pricer, vendor contract operations, and effectiveness around terms & conditions. Investigating government and industry operations highlighted the importance of inventory management: good inventory can greatly improve transition success.

The Program Development, Goals and Metrics committee looked at transaction costs, service level agreements (SLAs), success factors, value proposition models, and program management approach. Suggestions from this roundtable included offering positive incentives for providers to exceed SLAs, making it financially attractive for providers to get solutions in place rapidly (through the government covering up-front costs), and improving inventory management.
 
The Business Growth and Collaboration committee discussed strategies for maximizing portfolio business, small business strategy, regional strategy, promoting competition, and collaboration. Three points resulted from this topic: the need to reiterate GSA’s value proposition, the need for better visibility into the services covered by the fees GSA charges, and the need for standardization (e.g., order placement, inventory, and billing).

The Technology committee addressed the current and future scope for products and services, innovation, and technology evolution. Needs recognized in discussion of capabilities not currently provided under Networx included ubiquitous access and cloud computing. The vision this group described for the future was unlimited bandwidth, on demand, anytime, anywhere, with any device. Achieving this future would require more focus on services and less modification of commercial offerings.
 
The near term challenge for the working group is identifying where technology is heading and what services should be included in the scope of NS2020 (e.g. cloud computing, satellite services). Currently, GSA is working on determining what can be improved with the Networx program; for example, how inventory and service ordering can be made easier. At least for the time being, government is looking to get direction from industry.
 
The next ACT-IAC N&T SIG meeting will be held on June 27th to discuss cloud computing services.
 
Originally published for Federal Industry Analysis: Analysts Perspectives Blog. Stay ahead of the competition by discovering more about GovWinIQ. Follow me on Twitter @FIAGovWin.

2013 IACP LEIM Conference Recap

The 2013 International Association of Chiefs of Police (IACP) Law Enforcement Information Management (LEIM) Conference was held in Scottsdale, Ariz., from May 20-23, 2013, at the Fairmont Scottsdale Princess. The event included a variety of plenary sessions and workshops addressing executive, operational, and technical communications and interoperability. The LEIM section of IACP provides information through a variety of events regarding best practices and updates on state-of-the-art law enforcement technology. More than 1,000 chiefs and information technology professionals are members of the LEIM section, including many from outside of the continental United States.
 
This year’s IACP LEIM did not feature a keynote speaker, so the plenary sessions kicked off quickly following the opening ceremony and welcoming remarks. The first of the two sessions, “The Evolving Role of Technology in Policing: Results of the IACP/LEIM IT Summit,” featured remarks from Scott Edson, commander at the Los Angeles Sheriff’s Department and chair of the LEIM Board of Officers; Thomas Casady, Lincoln, Neb., public safety director; Lance Valcour, executive director of the Canadian Interoperability Technology Interest Group (CITIG) and member of the LEIM Board of Officers; and Steve Williams, major and chief technology officer with the Florida Highway Patrol and member of the LEIM Board of Officers.
 
A common thread among many of the event’s speakers and sessions was using technology to your advantage and finding ways to “make it work” within your agency. One main issue, aside from the cost of new technology, is governance and local-legislator buy-in.
 
For more analysis of some of the hot topics and other insights from the 2013 IACP LEIM Conference, please check out the full five-page recap at GovWin.com.

NIST to Hold Workshop Series on Cybersecurity Framework

Based on early reviews of the 2014 budget request, it appears agency efforts to improve cybersecurity will receive continued attention for the foreseeable future. As part of the executive order for cybersecurity, the National Institute of Standards and Technology (NIST) was given responsibility for developing a cybersecurity framework. The first in a series of workshops on developing this “living framework” was held in Washington, D.C., on April 3, 2013. Much of the discussion revolved around risk management and the role of industry in identifying best practices. (Not surprisingly, these are issues that government agencies have been facing too.)

 
In mid-March, we looked at the role of private industry in implementing the cyber executive order. For government, the goal of partnership with industry is to strengthen national security both within government and across private industry. To that end, the public sector has been reaching out for input from industry, academia and the public. As Rebecca Blank, Deputy Secretary for the Department of Commerce, phrased it in her opening comments: “Government cannot and should not do this alone.”
 
It’s clear that improved information sharing, situational awareness, and public-private partnership have roles to play in moving forward. For the most part, government and industry agree that there’s a need to build on existing capabilities, to identify solutions that provide flexibility and that can adapt across varying sector requirements.
 
For many companies, cybersecurity has become an integral part of discussion around risk-management practices. Opinions vary about how to define “best practice,” and rightly so. Organizations do not have a consistent answer for how to measure the success of security practices. For the most part, risk levels are evaluated at the tactical level, rather than compared to strategic benchmarks. Raising risk and security management to a strategic level would clarify its role in business strategy. During an industry leadership panel discussion, Patrick Gallagher, the Undersecretary of Commerce for Standards and Technology and Director of NIST, described this challenge as the need “to learn about the balance between good cybersecurity and good business.”
 
In all likelihood, the best practices captured in the framework will illustrate a range of approaches to security implementation. This brings us to another sticky wicket: incentives. While there’s no certainty that following another company’s lead will bring the same success, effective policies and procedures around risk management can contribute to a competitive position. There is no current barrier to sharing practices. So what is going to change? What will motivate the private sector to adopt new security standards voluntarily? What role can the government play to facilitate the exchange?
 
For starters, the government is asking for input. The Departments of Homeland Security, Commerce and Treasury are working together to report on industry incentives. The Commerce Department posted a Notice of Inquiry on incentives for getting industry involved in the framework development process. Public comments are open until April 29, 2013.
 
Beyond that, several multiday workshops are being scheduled. The next session will be hosted at Carnegie Mellon from May 29th through 31st. Other sessions will be held in July and September, further informing the framework. The first draft of the framework is due in October 2013, allowing eight months from the release of the executive order for the draft to be crafted.

NASW’s Social Work Month coming to a close

This year’s annual Social Work Month and its theme of “Weaving Threads of Resilience and Advocacy” are coming to a close. The month-long event, which is spearheaded by the National Association of Social Workers (NASW), has been celebrated each year since the 1960s, and is an opportunity for communities nationwide to highlight the profession and the important contributions social workers make each day.
 
NASW is the largest membership organization of professional social workers in the nation. Its mission is to enhance professional growth and development of its members, create and maintain professional standards, and advance sound social policies. To honor Social Work Month, Deltek is taking a look at how New Mexico has immensely improved its Child Support Enforcement Division.
 
In June 2012, New Mexico was recognized by the National Child Support Enforcement Association (NCSEA) for having the most improved child support enforcement program in the country. The award is determined through an extensive look at a state’s child support program performance over three years to ensure consistent, broad-based improvement. In that time, New Mexico improved its Paternity Establishment Percentage (from 54th in the nation to 29th). The state’s child support enforcement system, eChild, is a Web-based solution that works in conjunction with the existing state legacy mainframe. New Mexico contracted with Health Management Systems in June 2012 to provide child support enforcement customer service.
 
New Mexico’s Child Support Enforcement Division (CSED) continues to provide child support enforcement services to the general public, as well as recipients of Temporary Assistance for Needy Families (TANF) and Medicaid. The mission of CSED is to reduce the impact of poverty on people living in New Mexico by providing support services that assist families in breaking the cycle of dependency on public assistance.
 
To learn more about New Mexico and other social services-related projects throughout the country, check out Deltek’s Vertical Profiles. Non-subscribers can learn more about GovWin IQ and sign up for a free trial here.

Agencies Struggle to Handle Big Data Challenges

I had the opportunity recently to attend an excellent conference hosted by the Technology Training Corporation. This conference, called the “Government Big Data Symposium,” meets every year at the Holiday Inn in Arlington, Virginia. Organizer Marcus Min and his people do a fantastic job assembling a roster of government officials and industry experts to discuss big data challenges, solutions, and applications. This year’s symposium was solid as always, yielding a number of insights that help put attendees’ fingers on the pulse of big data projects and initiatives at federal agencies. Here are a few of the major themes discussed during the conference that I found interesting.
The Data Tsunami Continues to Grow
Anyone involved in either analyzing big data or in selling solutions feels this problem on a daily basis. Several of this year’s speakers emphasized that federal agencies with scientific missions are already at or past the point of Petascale computing. The challenge of handling this data has become acute at even relatively small agencies like the National Oceanic and Atmospheric Administration (NOAA). Dr. Mark Luker of the Networking and Information Technology Research and Development (NITRD) Program pointed out that NOAA’s data demands are compelling it to add 30 Petabytes of storage per year to archive its data. This massive inflow of data is only expected to increase.
Take the example of NOAA and apply it to larger agencies like the Department of Energy and National Aeronautics and Space Administration (NASA) and you will quickly see that the challenge of big data is not going away. This challenge presents a real business opportunity for vendors. Agencies are so reliant on data to accomplish their missions that storage vendors are in the enviable position of providing capacity that is not only desired, it is mission-critical. Similarly, those providing analytics are seeing an uptick in interest as agency personnel grapple with the problem of too much data. Finally, lest services vendors feel left out, agencies are in need of consulting services and data analysis services like never before as they try to understand how to incorporate the next generation of analytical tools into their IT environments.
The Changing Complexion of Solution Sets
Since discussion of big data arose a few years back it has become common to hear about the need for data scientists. Ideally these specialists would belong to an integrated team of professionals that parse and analyze data to enable valuable business decisions. This approach remains a best practice, but it presents federal agencies with a couple of significant challenges: a shortage of trained personnel and increased costs. Not only is the data scientist a rare breed that is in demand in both the public and private sectors, he/she also commands a good salary. In the current environment of fiscal austerity, finding and employing data scientists raises the bar for agencies seeking to invest in big data solutions.
Advancing technology is addressing this challenge, however, by providing alternatives that do not require specialized personnel to operate. Tableau would be one of these. As Sean Brophy of Tableau explained to me at the TTC Government Big Data Symposium, his company’s solution provides visualization capabilities for non-IT specialists, making it easy to use and reducing the need for agency spending on specialized personnel. I do not endorse one commercial solution or another, but it struck me that gearing solutions to non-specialists is the smart way to go for analytics vendors seeking to increase their share of the market in a fiscally constrained environment.
Cloud Computing and Big Data Come Together
Another common theme at the symposium was the growing nexus of cloud computing and big data solutions. Representatives from multiple agencies expressed interest in employing big data solutions in the cloud. NASA Chief Technology Officer, Dr. Sasi Pillay, emphasized that the agency is poised to significantly increase its investment in commercial cloud computing solutions. Michael Simcock, Chief Data Architect at Homeland Security (DHS) also said that his department is interested in making greater use of cloud for big data solutions. The only caveat was that the solution will be hosted in a private cloud. DHS will not use a public cloud for big data.
My impression from speaker comments is that the importance of the cloud for growth in federal big data investments cannot be overstated. Cloud computing offers a relatively simple way to acquire the required solutions, and it can scale up computing power on demand. For example, Dr. Nancy Grady of SAIC described a proprietary solution that automatically senses the processing load in the data queue and spins up (or down) the required number of machines to get the job done. Given the interest at federal agencies in acquiring greater computing power on demand, it sure looks like this will be an area of continued agency investment for years to come.
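The SAIC solution itself is proprietary, but the core idea Dr. Grady described, sizing a worker fleet from queue depth, can be sketched in a few lines. This is a minimal illustration under assumed parameters (per-machine throughput, drain target, fleet limits), not a description of any vendor's actual implementation:

```python
import math

def machines_needed(queue_depth, per_machine_throughput, target_seconds,
                    min_machines=1, max_machines=64):
    """Return how many workers to run so the queued jobs drain within
    target_seconds, clamped to the allowed fleet size."""
    if queue_depth <= 0:
        return min_machines          # idle queue: keep the floor running
    # Jobs one machine clears in the target window, then round up.
    required = queue_depth / (per_machine_throughput * target_seconds)
    return max(min_machines, min(max_machines, math.ceil(required)))

# 1,200 queued jobs, 2 jobs/sec per machine, drain within 60 s
print(machines_needed(1200, 2, 60))  # -> 10
```

A controller polling the queue every few seconds and calling a function like this is the "spin up or down" behavior described above; the clamp keeps cost bounded when the queue spikes.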

eHealth Initiative’s Annual Conference: moneyball and big data

As eHealth Initiative’s Annual Conference kicked off last Tuesday in Orlando, Fla., it was clear that the two-day event would offer a candid forum for health care leaders, state representatives and vendors to share the highs and lows, as well as what lies ahead in the ever-shifting world of health information technology.
 
eHealth Initiative Chief Executive Officer Jennifer Bordenick’s welcome address focused heavily on the perseverance the health care industry must have to achieve meaningful, lasting results. Despite major breakthroughs in innovation over the last several years, she said the industry is “starting to see some cracks around the edges.” Bordenick emphasized the uncertainty that surrounds health IT as well as the frustration felt by the private and public sector struggling to implement health care reform, ICD-10 and meaningful use amid strapped budgets and failed projects. She didn’t sugarcoat the introduction; the conference would be a place to discuss successes and pitfalls with “brutal honesty and humility.”
 
Still, Bordenick’s ultimate message was one of hope and encouragement. She took a few minutes to highlight a renowned group of political and cultural icons, all who experienced major defeats before breaking ground – Abraham Lincoln, Steve Jobs, Steven Spielberg, Michael Jordan and Jerry Seinfeld.
 
“It’s not about just getting it right,” said Bordenick. “It’s about perseverance – taking the time to get it right.”
  
In his keynote address, Optum’s Group Executive Vice President Andrew Slavitt discussed the opportunities and challenges of technology and big data. He said the good news is that the next generation of providers expects to use technology in their jobs instead of avoiding it. The challenge is that geographic gaps in technology adoption could worsen as more technologies are introduced, due to limited resources.
 
Slavitt said current health technologies are being built to fulfill the needs of the old world instead of the new, and emphasized the importance of understanding the needs of the dual eligible population – about 10 million Americans who qualify for both Medicare and Medicaid, and “whose health and social needs are at the root of almost everyone’s health care costs.”
 
According to Slavitt, the health care industry would benefit from adopting a moneyball approach to data analytics. For those unfamiliar with Michael Lewis’ book or the 2011 motion picture starring Brad Pitt, moneyball refers to how baseball’s Oakland Athletics used heavy analytics to assemble a team that went on to win a majority of its games in 2001 on a limited budget. Instead of relying on typical batting averages and game statistics, the A’s professionalized data and analytics, and developed a data strategy.
 
Slavitt also noted that adoption and implementation of technology will accelerate once data speaks the language of margin and market share. He said with a moneyball approach and investment in health IT, keeping one in five dual eligible individuals in their homes instead of health institutions is an achievable goal, and one that would drastically reduce the $360 billion spent on their care.
 
In closing his address, Slavitt asked the crowd if big data was hype (more promise than will be realized), a fad (gone in five years), or a trend (true, lasting change). While the majority of the attendees believed big data to be a trend, Slavitt leaned toward hype. He said at this time, big data has a major lack of interoperability, standards, severity data, and linked data, all of which “hurts patient care” and “drains the usefulness out of data.” He said until a solution achieves these needed components, the risks of big data can be compared to “playing with fire.”
 
To read more about the eHealth Initiative Annual Conference, please read the full analyst recap.

The waiting game is over: States must act on Obamacare

“The law is the law, whether you like it or not. It doesn’t matter if you like it. It’s the damn law.” 

Many governors and insurance department heads awoke this morning with Mississippi Insurance Commissioner Mike Chaney’s words ringing true after last night’s reelection of President Obama. States holding out for a change in federal leadership on health reform now have fast decisions to make. The numbers are staggering for Mississippi: one in five people lack health insurance; it leads the “States of Misery” in health, poverty, and crime statistics; and it has the highest level of obesity in the country at 34.9 percent. Despite Governor Phil Bryant calling for a stall on Obamacare, Chaney is creating a health insurance exchange (HIX) under his own authority and, with an Obama victory, plans to file a blueprint on November 16, unless he receives a court order from “some idiot out there trying to stop me.” Though his words could be considered somewhat crude, the logic behind them is solid: a state’s ignoring the law does not make the law disappear, and these words come from someone opposed to Obamacare.

 

Exit polls from last night showed that roughly a third of voters listed health care as an important factor in their vote. Despite Obama’s reelection, several states approved ballot measures to limit Obamacare, including Missouri, Alabama, Wyoming, Florida, and Montana. Although some states were opposed to health care reform from the beginning, those that started the exchange planning process are finding that they have run out of time, and will likely adopt the federal exchange until a state-based exchange can be built.

 

With a scramble to hit the 2014 deadline, procurement strategies may be expedited, like Connecticut’s sole-source award to Deloitte for both its HIX and its integrated eligibility system. Expect to see even the early innovators relying heavily on federal hub resources for the first enrollment period. As Chaney pointed out, there is no more waiting; the Affordable Care Act is the law. Deltek will be watching as blueprints are submitted to the Center for Consumer Information and Insurance Oversight by November 16, 2012, and how federal-state relationships play out as the nation addresses health care reform.

 

As always, be sure to follow Deltek’s Health Care and Social Services Team on Twitter @GovWin_HHS, or connect with us through LinkedIn. Stay tuned for more information on a new Health Insurance Exchange Vertical Profile addition in the near future!

 

 
