Tag Archives: ABI

Newer versions of LTE to make rapid advances, ABI says

Emerging technologies for 4G LTE networks are expected to make rapid advances over the next few years, helping mobile networks keep up with data growth and bringing more users worldwide into the LTE fold.

By 2018, a majority of the world’s LTE subscriptions will be on networks that use either TD (time-division) LTE or features from the emerging LTE-Advanced standard, according to an ABI Research forecast released on Monday.

Even as mobile operators continue expanding infrastructure based on FD (frequency-division) LTE, the earliest version of the high-speed mobile system, the two more recent technologies are fast making inroads, according to ABI analyst Nick Marshall. They may dominate networks of large, outdoor “macro” cells by 2015, Marshall said.

TD-LTE uses one band of frequencies to send traffic both downstream and upstream, while FD-LTE uses separate, equal-size bands for the two directions. TD-LTE makes LTE possible in countries that license so-called unpaired spectrum. It also lets operators dedicate more capacity to downstream traffic, such as Web and video content, than to upstream traffic such as photo uploads.


Source: FULL ARTICLE at PCWorld

App Stores to Generate $25 Billion

By Steve Heller, The Motley Fool

Certain sectors of the economy remain extremely robust despite global macroeconomic challenges. In particular, the mobile app economy is expected to surpass $25 billion this year, according to ABI Research. Smartphones are expected to rake in $16.4 billion of app spending, leaving $8.8 billion for tablet app spending. By 2018, ABI expects that tablet app purchases will grow larger than smartphone app spending. In this video, Motley Fool contributor Steve Heller discusses the details of this report and which companies are likely in the best position to benefit. The two big winners? Apple and Google.

There’s a debate raging as to whether Apple remains a buy. The Motley Fool’s senior technology analyst and managing bureau chief, Eric Bleeker, is prepared to fill you in on both reasons to buy and reasons to sell Apple, and what opportunities are left for the company (and your portfolio) going forward. To get instant access to his latest thinking on Apple, simply click here now.

Source: FULL ARTICLE at DailyFinance

Exelis Wins Korean Weather-Sat Contract

By Rich Smith, The Motley Fool

Turns out Americans aren’t the only people who want to know what the weather’s going to be like tomorrow. On Monday, aerospace and defense firm Exelis announced that it has won a “multimillion dollar” contract to build an advanced geostationary weather imaging satellite for South Korea.

The Advanced Meteorological Imager, built as part of South Korea’s GEO-KOMPSAT-2A program, is said to be a “Korean version of the Advanced Baseline Imager (ABI) Exelis is currently building for the U.S. National Oceanic and Atmospheric Administration and NASA,” and an analog to our GOES-R series of satellites. Exelis is currently in the process of building seven of these ABI-class satellites — four for the U.S., two for Japan, and now one for South Korea.

These satellites hold position 22,300 miles above Earth, matching the planet’s rotation so that they stay fixed over one spot and can constantly monitor specific regions of the Earth’s surface, rather than circling the globe. South Korea wants its satellite, in large part, to improve its ability to keep track of locally forming typhoons and similar severe weather patterns.

Exelis shares rose 0.4% in Monday trading, closing at $11.16. 

The article Exelis Wins Korean Weather-Sat Contract originally appeared on Fool.com.

Fool contributor Rich Smith and The Motley Fool have no position in any of the stocks mentioned. Try any of our Foolish newsletter services free for 30 days. We Fools don’t all hold the same opinions, but we all believe that considering a diverse range of insights makes us better investors. The Motley Fool has a disclosure policy.

Copyright © 1995 – 2013 The Motley Fool, LLC. All rights reserved.


Source: FULL ARTICLE at DailyFinance

ITT Exelis advanced weather imager technology to improve forecasting capabilities in South Korea

By Business Wire via The Motley Fool


Contract highlights company’s international and commercial interests

ROCHESTER, N.Y.–(BUSINESS WIRE)– ITT Exelis (NYS: XLS) has been awarded a multimillion-dollar contract to provide South Korea an advanced geostationary weather imager to support the country’s forecasting capabilities.

Under the GEO-KOMPSAT-2A program, Exelis will deliver an Advanced Meteorological Imager (AMI), which will be launched into geostationary orbit in 2017. The AMI is a Korean version of the Advanced Baseline Imager (ABI) Exelis is currently building for the U.S. National Oceanic and Atmospheric Administration and NASA for the next-generation Geostationary Operational Environmental Satellite series known as GOES-R.

“AMI will provide South Korea more data more regularly and at higher resolution resulting in better advanced warning, which is critical for saving lives and property,” said Rob Mitrevski, an Exelis Geospatial Systems vice president who leads environmental intelligence and integrated geospatial sensors and systems. “Recent hurricanes and major storms have shown the critical role played by geo-imagers here in the United States. South Korea similarly has challenges with typhoons and other severe weather and will benefit greatly from this new geostationary imager.”

Geostationary imagers fly 22,300 miles above Earth staring at specific regions, providing constant, near real-time data to weather forecasters. Known as sentinels in the sky, these satellite instruments are critical to short-term and immediate severe weather forecasting. Geo-imagers capture most of the images of hurricanes and storms taken from space, which are shown by meteorologists on television and in other media. The ABI-class imager being used by South Korea provides five times the temporal resolution of current imagers, completing a scan of the full hemisphere in five minutes rather than 30.

The AMI will also provide additional spectral bands and twice the spatial resolution of South Korea’s existing satellites, down to about one-half mile. These increased capabilities and lower latency will give weather forecasters new products and tools to improve their forecasts.

With the addition of the GEO-KOMPSAT-2A program, Exelis is now building seven ABI class instruments: four for NOAA and NASA and two for Japan. Exelis has provided every geostationary imager and sounder to the U.S. government since 1994 and also built the current geo imagers flown by Japan and South Korea.

Source: FULL ARTICLE at DailyFinance

Muon Suite 2.0.0 released

I am proud to announce the first alpha release for Muon Suite 2.0. The Muon Suite is a set of package management utilities for Debian-based Linux distributions built on KDE technologies. Packages for Kubuntu 12.10 “Quantal Quetzal” are available in the QApt PPA.

2.0? A rewrite?

Nope! I had to make changes to LibQApt that prevent programs compiled against LibQApt 1.x from running against, or compiling against, LibQApt 2.x. Muon Suite 2.0 simply means that it uses LibQApt 2.0. (In developer speak, this release breaks both ABI and API. It’s mostly source-compatible, but will require a few changes/additions in programs using LibQApt.) I’ll write a separate post about LibQApt2 explaining the changes in detail. Most of my efforts this cycle have gone towards LibQApt2, but that doesn’t mean there’s nothing new on the Muon front. (In fact, quite a bit of work was done simply in the port to QApt 2.0.)

KNewStuff3 support

Both the Muon Software Center and Muon Discover now support installing things via KDE’s KNewStuff framework version 3. This is the framework that allows developers to publish scripted plugins such as Plasma widgets to the world. Currently the Muon Software Center and Muon Discover have categories for Plasma widgets (as well as plugins for the Comics plasmoid) utilizing KNS3. Suggestions for further categories using KNS3 are welcome.

Aleix wrote about this feature in detail at his blog. As he wrote, the work in supporting multiple resource types opens up the possibility of new backends. (Perhaps a backend that grabs data from AppStream in the future.) Exciting stuff.

Muon Discover UI Improvements

A lot of work has gone into improving the user interface of Muon Discover by my colleagues Aurélien Gâteau and Aleix Pol Gonzalez. Muon Discover now integrates much better with the rest of KDE, and is in general easier to use.


Changelogs

Detailed changelogs for LibQApt and Muon can be found here and here, respectively.

Plans for 2.1

Even though 2.0 has just been released, we’ve had some things on the back burner waiting for 2.1 that are already done. A plan for the feature set of 2.1 can be found here.

Source: FULL ARTICLE at Planet KDE

QtWebKit 2.3.0 is out

KDE Project:

Good news everybody!

QtWebKit 2.3.0 was tagged in gitorious yesterday, with the tarball available there as well.

For those of you who don’t know: QtWebKit 2.3 is a port of QtWebKit from Qt 5 to Qt 4.8. It has most of the web-facing features, stability fixes and performance improvements that QtWebKit in Qt 5 has; it skips anything Qt 5-specific such as QQuickWebView, but includes almost all improvements on the WebKit1 side (QWebView). QtWebKit 2.3 also maintains API and ABI compatibility with QtWebKit 2.2 from Qt 4.8, and is thereby an easy drop-in replacement. The released version 2.3.0 has roughly the same WebKit version and patches as Qt 5.0.2.

Note that QtWebKit 2.3 is not an official Qt release, nor will it be. We recommend that users upgrade to Qt 5, but for those stuck on Qt 4.8 for the time being who use QtWebKit, I would personally recommend trying out QtWebKit 2.3. I have also had great feedback from the developers of the Rekonq and Qupzilla browsers, who likewise recommend that their users try out 2.3. Several distributions are either packaging it or planning to package it, the first being Arch Linux, which has been packaging QtWebKit 2.3 since the betas.

Note that the sources are only buildable using the build-webkit tool, and require the QTDIR environment variable to be set, even if only to /usr. The basic build command is "Tools/Scripts/build-webkit --qt --release --no-webkit2". If you are packaging for x86, you might also want to add --no-force-sse2, since the library would otherwise default to using SSE2 math. Additionally, you can use --qmakearg="CONFIG+=production_build" to link with less memory. Finally, I have added an option to include WebP support by adding DEFINES+=HAVE_LIBWEBP=1 to the qmakearg. After building, you need to go to WebKitBuild/Release and run make install.

If you have any questions, you can check my earlier posts about QtWebKit 2.3, ask on webkit-qt@lists.webkit.org mailing list or catch me on FreeNode IRC #qtwebkit.

Source: FULL ARTICLE at Planet KDE

Enterprise Expert Jason McNicol Joins ABI Research

By Business Wire via The Motley Fool

OYSTER BAY, N.Y.–(BUSINESS WIRE)– ABI Research announces that enterprise authority Jason McNicol has joined the company in the position of senior analyst. In his role at ABI he will primarily focus on enterprise mobility management services and enterprise applications for smartphones and tablets. His research responsibilities will also include assessment of vertical markets, enterprise 4G services, and demand and supply side business model analysis.

Before joining ABI Research, Jason was a senior research marketing analyst at Cox Communications, where he was responsible for deep market segmentation and company performance/forecasting analyses. Other experience includes assisting local start-ups with business plan assessments and teaching various university-level business courses. Jason holds a BBA from Texas Tech University, and an MBA and PhD from the University of Texas at El Paso.

Enterprise practice director Dan Shey comments, “In a market with a complex array of devices, applications and services, Jason’s expertise will help suppliers and businesses identify the right opportunities to become more connected, smarter, and mobile.”

For more information about enterprise services covered by Jason, please visit ABI Research’s Enterprise Mobile Devices (http://www.abiresearch.com/research/service/enterprise-mobile-devices/), Enterprise Mobility Applications and Services (http://www.abiresearch.com/research/service/enterprise-mobility-applications-and-services/) and Mobile Enterprise Technologies Research Services (http://www.abiresearch.com/research/service/mobile-enterprise-technologies/) which include Research Analyses, Competitive Assessments, Insights, and Market Data products.

ABI Research provides in-depth analysis and quantitative forecasting of trends in global connectivity and other emerging technologies. From offices in North America, Europe and Asia, ABI Research’s worldwide team of experts advises thousands of decision makers through 70+ research and advisory services. Est. 1990. For more information visit www.abiresearch.com, or call +1.516.624.2500.

ABI Research
Christine Gallen, +1-516-624-2542
pr@abiresearch.com

The article Enterprise Expert Jason McNicol Joins ABI Research originally appeared on Fool.com.

Try any of our Foolish newsletter services free for 30 days. We Fools may not all hold the same opinions, but we all believe that considering a diverse range of insights makes us better investors. The Motley Fool has a disclosure policy.

Copyright © 1995 – 2013 The Motley Fool, LLC. All rights reserved.

Source: FULL ARTICLE at DailyFinance

Didier Roche: Unity: release early, release often… release daily! (part 1)

For almost the past two weeks (and some months for other parts of the stack), we have automated the daily release of most Unity components, delivered directly to Ubuntu raring. You can see those different kinds of automated uploads published to the Ubuntu archive.

I’m really thrilled about this achievement, which we discussed and set up as a goal for this release at the past UDS.

Why?

The whole Unity stack has grown tremendously in the past three years. Back then, we were able to release all components, including packaging and uploading to Ubuntu, in less than an hour! Keeping the one-week release cadence was quite easy, even if it was a lot of work. The benefit was a fluid process for pushing what we developed upstream out to our users.

As of today, the teams have grown considerably, and if we count everything that we develop for Unity nowadays, we have more than 60 components. These range from indicators to the theme engine, from the open input framework to all our test infrastructure like autopilot, from webapps to web credentials, from lenses to libunity, and finally from compiz to Unity itself, without forgetting nux, bamf… and the family is still growing rapidly, with a bunch of new scopes coming down the pipe through the 100 scopes project, our own SDK for the Ubuntu phone, and the example applications for this platform that we are about to upload to Ubuntu as well… Well, you get it: the story is far from over!

So, it’s clear that the number of components we develop and support will only keep growing. The integration team has already scaled up its work hours by a large extent, rushing to get everything delivered on time to our user base[1]. However, there is no question that we can’t keep that up forever. Nor do we want to introduce artificial delays for our own upstreams in delivering their work to users. We needed to solve this issue without paying any price in quality or compromising the experience we deliver to our users. We want to keep high standards and, why not, even get a better, more reliable evaluation of what we eventually upload to Ubuntu from our upstreams before releasing it. Having our cake and eating it too! :)

Trying to change for the better

What we did in the last couple of cycles was to split the delivery of those packages to users between two groups. An upstream integration team would hand theoretically ready-to-upload packages over to the Ubuntu platform team, reviewing them, helping out, fixing some issues and finally sponsoring the work into Ubuntu. However, this didn’t really work, for various reasons, and we quickly realized that it just made the process more complex instead of easing it. You can see a diagram of where we ended up when looking back at the situation:

end of 12.04 and 12.10 release process

Seems easy, doesn’t it? ;) Due to those inner loops and gotchas, in addition to the whole new set of components, we went from a weekly release cadence to doing four to five solid releases during a whole cycle.

Discussing it with Rick Spencer, he gave me carte blanche to think about how we could make this more fluid for our users and developers. Indeed, with all the work piling up, it wasn’t possible to immediately release the good work upstream had done in the past days, which could lead to some frustration as well. I clearly remember that Rick used the phrase “think about the platform[2] as a service”. This immediately echoed in me, and I thought: “Why not try that… why not release all our code every day, and deliver a service enabling us to do that?”

Daily release, really simplified diagram

Even if it wasn’t planned from the beginning (sorry, no dark, hidden, evil plan here), thinking about it, this makes sense as a logical progression from where we started when Unity began:

  • Releasing every week, manually trying to get the release into a reasonable shape before uploading to Ubuntu
  • Raising the quality bar, putting processes in place for merge reviewing
  • Adding Acceptance Criteria, thanks to Jason Warner, ensuring more and more good tests and formal conditions for doing a release
  • Automating those merges through a bot, ensuring that every commit in trunk builds fine and that unit tests pass
  • Raising the quality bar again, adding more and more integration tests

Being able to release daily, and ensuring that we can, looks in hindsight like the next logical step! But it wouldn’t have been possible without all those past achievements.

Advantages of daily releases

It’s easy to see a bunch of positive aspects of doing daily releases:

  • We can spot regressions much faster. If a new issue arises, it’s easier to bisect through the packages and find the day the regression or incorrect behavior started to happen, then look at the few commits to trunk (3-5?) made that day and pinpoint what introduced it.
  • It enables us to deliver everything in a rapid, reliable, predictable and fluid process. We won’t have the crazy rushes of past cycles around goal dates like feature freeze to get everything into users’ hands in time. Work is delivered automatically to everyone the day after it lands.
  • I also see this as a strong motivation for the whole community contributing to those projects. No more waiting for a “planned release date” hidden in a wiki to see the hard work you put into your code propagated to users’ machines. You can immediately see the effect of your hacking on the broader community. If it’s reviewed and approved for merging (and the tests pass; I’ll come back to that later), it will be in Ubuntu tomorrow, less than 24 hours after your work reaches trunk! How awesome is that?
  • It also means that developers only need to build the components they are working on. No need, for instance, to rebuild compiz or nux to write a Unity patch just because the API changed and you need “latest everything” to build. This lowers the barrier to contributing, and the chance of unwanted files lingering in /usr/local and conflicting with the system install.

Challenges of daily release

Sure, this comes with various risks that we had to take into account when designing this new process:

  • The main one is: “Yeah, it’s automated, but how can you be sure you don’t break Ubuntu by blindly pushing upstream code to it?” It’s a reasonable objection, and we didn’t ignore it at all (having years of personal pain behind us over what it takes to get a release into an acceptable shape to push to Ubuntu). :)
  • How do we interact properly with the Ubuntu processes? Only core developers, MOTUs, and per-package uploaders have upload rights to Ubuntu. Would this new process effectively give our internal upstreams the keys to the archive, without them having proper upload rights?
  • How do we ensure packaging changes and upstream changes are in sync when preparing a daily release?
  • How do we put useful information in the changelog, so that someone not following upstream merges closely, but only looking at the published packages in the raring-changes mailing list or update-manager, can see what mainly changed in an upload?
  • How do we deal with ABI breaks and similar transitions, especially when they are cross-stack (like a bamf API change impacting both indicators and Unity)? How do we ensure that what we deliver is consistent across the whole stack of components?
  • How do we ensure that we are not blindly releasing useless changes (or even an upload with no change at all), using up bandwidth, build time, and so on, for nothing?

How then?

I’ll detail most of those questions, and how we try to address those challenges, in subsequent blog posts. For now, to sum up: we have a good automated test suite, and stacks are only uploaded to Ubuntu if:

  1. their internal and integration tests pass above a certain threshold of accepted failures, and
  2. they don’t regress other stacks.

Stabilizing those tests to get reliable results was tremendous cross-team work, and I’m really glad that we are confident enough in them to finally enable those daily releases.

In addition, a control ensures that packaging changes are not committed without someone with upload rights being in the loop to ack the final change before the copy to the distribution happens.

Of course, the whole machinery is not limited to this and is in fact far more complicated. I’ll have the pleasure of writing about it in separate blog posts over the following days. Stay tuned!

Notes

[1] particularly around the feature freeze

[2] the Ubuntu platform team here, not Ubuntu itself ;)

Source: FULL ARTICLE at Planet Ubuntu