Modern production agriculture has the potential to dramatically improve crop yields and reduce environmental impacts by enabling farmers to properly evaluate past, current, and future farm management decisions through analysis of agronomic data generated in the field. However, farmers are currently overwhelmed with walled gardens of incompatible data generated by their existing systems (geodata images, logs, reports, charts). Farmers want the hardware and software systems they use to interoperate – that is, to share information and to rely on each other to support decision-making.
School in the Cloud allows learning to happen anywhere by supporting children all over the world to tap into their innate sense of wonder and ability to work together in Self Organised Learning Environments.
On Monday, Cisco unveiled an investment worth more than $1 billion to launch the world’s largest global Intercloud over the next two years. This open network of clouds will be hosted across a global network of Cisco and partner data centers featuring application programming interfaces (APIs) for rapid application development, built out as part of an expanded suite of value-added application- and network-centric cloud services.
[…]
Riegel said the global Intercloud differentiates itself from Amazon Web Services in five ways. First, its focus will be on apps, not the infrastructure. Second, he explained, Cisco is the only company that can provide quality of service from the network all the way up to the application. Third, Cisco has designed the Intercloud for local data sovereignty in the post-Edward Snowden era, in which customers increasingly request that their data stay in their own home countries or the particular countries where they do business.
The fourth big difference is that the Intercloud is firmly based on an open model: open source innovation built on OpenStack. The last difference is that the Cisco network will incorporate real-time analytics.
Amazon Web Services is a good way to store code and other data, but it can get a little pricey. Before you upload your stuff to the Amazon cloud, check out Heap’s article, “How We Estimated Our AWS Costs Before Shipping Any Code.” Heap is an iOS and Web analytics tool that captures every user interaction. The Heap team decided to build it because no existing product offered ad-hoc analysis of a user’s entire activity. Before they started working on the project, the team needed to estimate their AWS costs to decide whether the idea was a sustainable business model.
They needed to figure out how much data was generated by a single user interaction, and then where the data would be stored and what to store it on. The calculations showed that for the business model to work, a single visit would have to yield an average of one-third of a cent to be worthwhile for clients.
CPU cores, compression, and reserved instances reduced costs, but some unexpected factors inflated them:
1. “AWS Bundling. By design, no single instance type on AWS strictly dominates another. For example, if you decide to optimize for cost of memory, you may initially choose cr1.8xlarge instances (with 244GB of RAM). But you’ll soon find yourself outstripping its paltry storage (240 GB of SSD), in which case you’ll need to switch to hs1.8xlarge instances, which offer more disk space but at a less favorable cost/memory ratio. This makes it difficult to squeeze savings out of our AWS setup.
2. Data Redundancy. This is a necessary feature of any fault-tolerant, highly available cluster. Each live data point needs to be duplicated, which increases costs across the board by 2x.”
Heap’s formula is an easy and intuitive way to calculate pricing for Amazon Cloud Services. Can it be applied to other cloud services?
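The shape of Heap’s reasoning can be sketched in a few lines: data generated per visit, times storage cost, times the 2x redundancy factor the article mentions, compared against revenue per visit. All of the specific numbers below are hypothetical placeholders, not Heap’s actual figures:

```python
# Back-of-the-envelope AWS cost model in the spirit of Heap's estimate.
# Every constant here is an illustrative assumption, not Heap's real data.

BYTES_PER_EVENT = 1_000           # assumed size of one captured interaction
EVENTS_PER_VISIT = 50             # assumed interactions in an average visit
STORAGE_COST_PER_GB_MONTH = 0.10  # assumed $/GB-month for S3/EBS-class storage
REDUNDANCY_FACTOR = 2             # each live data point is duplicated (from the article)
RETENTION_MONTHS = 12             # assumed retention window

def cost_per_visit():
    """Storage cost in dollars attributable to one visit."""
    gb = BYTES_PER_EVENT * EVENTS_PER_VISIT * REDUNDANCY_FACTOR / 1e9
    return gb * STORAGE_COST_PER_GB_MONTH * RETENTION_MONTHS

def is_sustainable(revenue_per_visit):
    """A visit is worthwhile if it yields more than it costs to store."""
    return revenue_per_visit > cost_per_visit()
```

With these toy numbers a visit costs a small fraction of the one-third-of-a-cent threshold the article cites, which is the kind of margin check the Heap team was after. The same skeleton transfers to other cloud providers by swapping in their per-GB rates.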
iRevolution crossed the 1 million hits mark in 2013, so big thanks to iRevolution readers for spending time here during the past 12 months. This year also saw close to 150 new blog posts published on iRevolution. Here is a short selection of the Top 15 iRevolution posts of 2013:
I’ve been invited to give a “very provocative talk” on what humanitarian response will look like in 2025 for the annual Global Policy Forum organized by the UN Office for the Coordination of Humanitarian Affairs (OCHA) in New York. I first explored this question in early 2012 and my colleague Andrej Verity recently wrote up this intriguing piece on the topic, which I highly recommend; intriguing because he focuses a lot on the future of the pre-deployment process, which is often overlooked.
I’m headed to the Philippines this week to collaborate with the UN Office for the Coordination of Humanitarian Affairs (OCHA) on humanitarian crowdsourcing and technology projects. I’ll be based in the OCHA Offices in Manila, working directly with colleagues Andrej Verity and Luis Hernando to support their efforts in response to Typhoon Yolanda. One project I’m exploring in this respect is a novel radio-SMS-computing initiative that my colleague Anahi Ayala (Internews) and I began drafting during ICCM 2013 in Nairobi last week. I’m sharing the approach here to solicit feedback before I land in Manila.
The “Radio + SMS + Computing” project is firmly grounded in GSMA’s official Code of Conduct for the use of SMS in Disaster Response. I have also drawn on the Bellagio Big Data Principles when writing up the ins and outs of this initiative with Anahi. The project is first and foremost a radio-based initiative that seeks to answer the information needs of disaster-affected communities.
China is having a light-bulb moment. Scientists from the Shanghai Institute of Technical Physics have discovered that a one-watt LED bulb embedded with a microchip is capable of emitting a wireless signal, with enough strength to provide internet for four computers.
The discovery, aptly named “Li-Fi,” relies on special LED light bulbs that operate with light as the carrier instead of traditional radio frequencies.
Data rates as fast as 150 megabits per second were achieved with the new Li-Fi connection, making it faster, cheaper and more energy efficient than traditional Wi-Fi signals.
Li-Fi apparently uses only five percent of the energy required to power Wi-Fi-emitting devices, which rely on energy-hungry cooling systems to supply Internet to cell towers and Wi-Fi stations.
Though the discovery has huge potential for the way we use Internet connections, Li-Fi is still in a crude testing stage, since it doesn’t work if the light bulb is turned off or blocked. That doesn’t seem like such a huge burden, though: it just means you’ll have to leave your lights on if you want to surf the Web. No more online shopping binges in the dark!
Li-Fi demonstrations will take place on November 5 in Shanghai at the International Industry Fair, where 10 kits will be tested out. A bright future seems to be in store for Li-Fi usage, which could range from using car headlights or focused light to transmit data, among many other potential applications.
People everywhere have been organizing a more ethical economy, but they work in relative isolation, fragmented by geography, sector, and even organizational form.
Many organizations collect information about a small piece of these efforts. For every such organization, there is another whose information overlaps with its own. In every case there is an opportunity to share that will strengthen all the organizations participating.
Sharing requires effort, it requires trust, and it requires infrastructure. The Data Commons is a cooperative of organizations that are sharing – sharing the costs of this effort, trusting each other with their information, and building infrastructure to make sharing easy.
Members of the Data Commons Cooperative are principled economic organizations that want it to be easy to share with each other, and with the world, in the movement for a more ethical economy.
The budget crunch is hitting everyone. IT departments are being asked to slim down and do more with less. Apparently the government is no exception. The affordability of open source has the government’s attention and is changing the content management and enterprise playing field. Read more about the changes in the Information Week article, “Feds Move To Open Source Databases Pressures Oracle.”
The piece begins:
“Under implacable pressure to slash spending, government agencies are increasingly embracing open source, object-relational database software at the expense of costly, proprietary database platforms. That’s putting new pressure on traditional enterprise software providers, including Oracle, to refine their product lineups as well as their licensing arrangements.”
So giants like Oracle are feeling the crunch, and it is trickling down throughout the proprietary world. But many organizations might not feel comfortable going completely open source, as in creating their own customized solution. So many are turning to a smart compromise: a value-added open source solution like LucidWorks. Customers get the affordability and agility of open source, plus the support and expertise of an industry leader. Check out their support and services for assurance that going open source does not mean you will be left on your own.
As technology advances quickly, so do security concerns. It stands to reason that new technologies open up new vulnerabilities. But open source is working to combat those challenges in an agile and cost-effective way. Read the latest on the topic in IT World Canada in their story, “Open-Source Project Aims to Secure Cloud Storage.”
The article begins:
“The open source software project named Crypton is working on a solution that would enable developers to easily create encrypted cloud-based collaboration environments. There are very few cloud services that offer effective encryption protection for data storage, according to Crypton. Security has always been the top concern for many enterprise organizations when it comes to cloud services and applications.”
It is reasonable that enterprises are concerned about security when it comes to cloud services and storage. For that reason, many prefer on-site hosting and storage. However, some open source companies, like LucidWorks, build value-added solutions on top of open source software and guarantee security as well as support and training. And while LucidWorks offers on-site hosting as well, those who venture into the Cloud can have the best of both worlds with cost-effective open source software and the support of an industry leader.
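The principle behind projects like Crypton (encrypt on the client, so the cloud provider only ever stores ciphertext) can be illustrated with a minimal sketch. This is a toy one-time-pad example using only the Python standard library; it is not Crypton’s actual API, and a real system would use an authenticated cipher with proper key management:

```python
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """Toy one-time pad: XOR the data with a random key of equal length.
    The key stays on the client; only the ciphertext goes to the cloud."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    """Reverse the XOR with the same key to recover the plaintext."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

# The provider stores only `blob`; without `key` it is indistinguishable
# from random noise, which is the security property Crypton aims for.
blob, key = encrypt(b"quarterly-report contents")
assert decrypt(blob, key) == b"quarterly-report contents"
```

The design point is where trust lives: because encryption and key storage happen client-side, a breach of the cloud service exposes only unreadable blobs.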
Back in 2003, visionary artist Anne-Marie Schleiner wrote an inspiring paper entitled “Fluidities and Oppositions among Curators, Filter Feeders and Future Artists,” describing the future role of online curators as nature’s own filter feeders. Anne-Marie is clearly referring to curators and filter feeders in the art world, but her intuitions are equally applicable to the larger world of information, data, digital and content curation as well.
But let me explain better.
First. The term “filter feeders” is used in nature to describe a group of animals that thrive on their ability to filter organic matter floating around them. From Wikipedia: “Filter feeders are animals that feed by straining suspended matter and food particles from water, typically by passing the water over a specialized filtering structure. Some animals that use this method of feeding are clams, krill, sponges, baleen whales, and many fish (including some sharks). Some birds, such as flamingos, are also filter feeders. Filter feeders can play an important role in clarifying water, and are therefore considered ecosystem engineers.” Wikipedia adds: “In marine environments, filter feeders and plankton are ecosystem engineers because they alter turbidity and light penetration, controlling the depth at which photosynthesis can occur.”
Second. If you re-read that last sentence slowly and consider what it could mean if applied to the field of content curation, it would read to me something like this: “In large information ecosystems like the web, filter feeders/content curators and content itself are ecosystem engineers because they: a) directly influence our ability to inform ourselves effectively and to discern true from false and useless info (turbidity); b) shed light and clarity on subjects which would otherwise remain obscure (light penetration); c) determine our ability to make sense of our own generated information streams (photosynthesis).” A very inspiring parallel indeed, giving us a way to visualize the true importance and role that curation, freed from the confines of museums and art galleries, could have on the planetary information ecosystem.

Anne-Marie writes: “Most web sites contain hyperlinks to other sites, distributed throughout the site or in a “favorites” section. Each of these favorite links sections serves as a kind of gallery, remapping other web sites as its own contents. Every web site owner is thus a curator and a cultural critic, creating chains of meaning through association, comparison and juxtaposition, parts or whole of which can in turn serve as fodder for another web site’s “gallery.” Site maintainers become operational filter feeders, feeding off other filter feeders’ sites and filtering others’ sites. Links are contextualized, interpreted and “filtered” through criticism and comments about them, and also by placement in the topology of a site. The deeper a link is buried, the harder it may be to find; the closer to the surface and the frontpage, the more prominent it becomes, as any web designer can attest. I am what I link to, and what I am shifts over time as I link to different sites… In the process, I invest my identity in my collection – I become how I filter.”

Anne-Marie’s vision (2003), pure and uninfluenced by what we have seen emerge in the last few years, paints a very inspiring picture of the true role of content curators and of the key responsibility they hold for humanity’s future. Inspiring. Visionary. Right on the mark. 10/10