Passage of the Federal Information Technology Acquisition Reform Act (FITARA) in December 2014, together with the recent draft implementation guidance released by the Office of Management and Budget (OMB), significantly expanded the responsibility and authority of Federal Chief Information Officers (CIOs). At the same time, it increased requirements for transparency and accountability in how IT is implemented agency-wide.
In August, FierceGovernmentIT, in collaboration with the research firm Market Connections, published a series of articles examining Federal agency adoption of modular acquisition/development strategies as described in OMB's 25 Point Plan and outlined in the June 2012 memo, Contracting Guidance to Support Modular Development.
A visible trend today is the transition from owning proprietary data centers to outsourcing data center operations. According to various market research reports, companies devote, on average, about 20% of their budgets, regardless of actual usage, to installing and operating data centers and IT equipment in-house. Companies are therefore looking to optimize their budgets (converting fixed operational costs to variable, usage-based costs) by outsourcing their data center operations to specialist organizations that handle this work on a daily basis.
I recently represented Octo Consulting by presenting a paper at an international research conference on digital government in Quebec. Although dgo 2013 was primarily a research conference, there were a number of public sector speakers, and the combination of theory and practice yielded the following observations from papers and presentations on topics ranging from “Make Data Actionable…” to “Community-Based Emergency Response.”
Octo attended the AFCEA/GMU Critical Issues in C4I Symposium from May 21st-22nd at George Mason University, where the key topics were the Joint Information Environment, Big Data, and Mobility.
We at Octo Consulting are passionate about Data, among many other things. For me personally, nothing is a bigger asset for any organization, Government or otherwise, than Data. This week, I and Dr.
Perhaps it was the Ides of March, or some other phenomenon (March Madness is already reserved for another event starting today), that spurred a flurry of activity in Congress this week. We at Octo are tracking several key pieces of legislation moving through Congress that have significant implications for our clients in the Federal IT community.
Continuing day 2's deep dive into SPARQL, we began day 3 by talking about inference and the value of property paths. In the semantic web, defining transitive properties lets you establish relationships across subjects. Consider the hierarchy:
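As a minimal sketch of the idea (the class names here are illustrative, not from the original post), suppose the data asserts only direct subclass links. A SPARQL 1.1 property path with the `+` operator follows `rdfs:subClassOf` one or more times, so the query finds every ancestor without the data explicitly stating the indirect relationships:

```sparql
# Hypothetical hierarchy, expressed in Turtle:
#   ex:Poodle rdfs:subClassOf ex:Dog .
#   ex:Dog    rdfs:subClassOf ex:Mammal .
#   ex:Mammal rdfs:subClassOf ex:Animal .

PREFIX ex:   <http://example.org/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

# rdfs:subClassOf+ traverses the transitive closure of the property,
# matching ex:Dog, ex:Mammal, and ex:Animal for ex:Poodle.
SELECT ?ancestor
WHERE { ex:Poodle rdfs:subClassOf+ ?ancestor . }
```

This is the payoff of treating a property as transitive: the relationships are inferred at query time rather than materialized in the data.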
Wednesday’s session was all about deep diving into SPARQL. The SPARQL Protocol and RDF Query Language (SPARQL) is a query language and protocol for RDF. The SPARQL protocol describes a means for conveying SPARQL queries to a SPARQL query processing service and returning the query results to the entity that requested them. A SPARQL query processing service (endpoint) accepts queries and returns results over HTTP. The fact that there is a single standard protocol for SPARQL is a huge advantage for the semantic web, making it easier for people to connect to data sources using non-proprietary means.
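As a sketch of what that exchange looks like on the wire (the host and path here are hypothetical; real endpoints vary), the SPARQL Protocol's query-via-GET binding passes the URL-encoded query in the `query` parameter, and the `Accept` header negotiates the result format:

```http
GET /sparql?query=SELECT%20%3Fs%20WHERE%20%7B%20%3Fs%20a%20%3Fo%20%7D%20LIMIT%2010 HTTP/1.1
Host: example.org
Accept: application/sparql-results+json
```

The endpoint responds with the results serialized in the requested format (JSON here); the protocol also defines POST bindings for queries too long to fit comfortably in a URL.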