Month: March 2013

Can RSS be a substitute for Reader…??


With Google Reader gone, other RSS readers are likely to gain popularity…


When tech giant Google announced that it was pulling the plug on Google Reader and that the service would not be available after July 1, 2013, it stirred up strong reactions. Google cited a decline in usage of the RSS feed service, which launched way back in 2005, as the reason for axing it. While Google may want us to believe its version of the story, the buzz the news of the closure has generated on social networks tells a different one.

In an effort to badger Google into changing its decision, users have even directed a petition to the White House, appealing to the Obama administration to ask Google to reconsider. It's rather amusing, and easy to laugh off. However, while there may be some truth to Google's claim of declining usage, that clearly doesn't mean there aren't enough people who care.

Google Reader made keeping up with everything you cared about on the Internet so much easier. It was much like getting your daily fix of the newspaper. You could subscribe to the RSS feeds of your favourite sites and blogs, and a new item would land in your reader as soon as the site or blog was updated. Additionally, you had the option to organise these feeds into folders, star them, search them, email them and even share them on Google+. And what made it all the sweeter was the fact that it synced across devices, so the feeds you had read were marked as read everywhere. Simply put, it was the best RSS feed reader out there.
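
If you're curious what an RSS reader actually does behind the scenes, here is a minimal sketch in Python using the third-party feedparser library; the feed URL is only a placeholder, and any site's RSS or Atom feed would do:

    import feedparser  # third-party library: pip install feedparser

    # Example feed URL; substitute the RSS/Atom feed of any site you follow.
    FEED_URL = "https://example.com/feed.xml"

    feed = feedparser.parse(FEED_URL)
    print("Feed title:", feed.feed.get("title", "unknown"))

    # Print the latest entries, newest first, much like a reader's unread list.
    for entry in feed.entries[:10]:
        print(entry.get("published", "no date"), "-", entry.get("title", "untitled"))
        print("  ", entry.get("link", ""))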

However, there has been a shift in the way people consume information. Rather than subscribing to feeds and skimming through tons of information, many seem to prefer reading whatever those in their network recommend or think they may like. They can also get the latest information first hand from the source, say, by following their favourite author on Twitter. And there is little risk of missing out on anything big, as the social networks will be abuzz with it.

Though comparing the two may be unfair, the fact remains that social networks are also emerging as a source of information, with people pushing their content onto these sites. Thanks to their popularity, every site, blog and even brand now feels compelled to make its presence felt on whichever platform is most popular with the masses. Having said that, social networks will never be able to replace RSS. There are many users who crave information without the fluff and want control over what they read, rather than being told what to read.

As a technology, RSS is still very much relevant, and the shutting down of Google Reader, no doubt among the best RSS feed services, shouldn't be a cause for concern. RSS will continue to exist. There are already tools and apps that offer RSS feed services; in fact, many of them used Google Reader as their back-end. What they need to do now is step up their offerings. Some of these tools – Feedly, Flipboard and Pulse, to name a few – have gained popularity by presenting feeds in a more appealing, social manner.

Where Google also failed with Reader, and received a lot of flak, was in stripping out its social features, such as following others and sharing links, back in 2011. It was a step backwards, probably taken to promote Google+, and it ended up infuriating Reader's loyal users. In all likelihood, this was the beginning of the end for Google Reader.

What we need is a service that leverages RSS to bring information to users while also offering features that make it relevant and appealing today. Digg has announced that it will soon launch a service along the lines of Google Reader; let's hope it manages to fill the void that Google Reader will leave behind.

 

How to build a fortune by developing applications…!


How to earn by developing applications…!


You’ve already beaten the odds — your app is successfully gaining adoption. But now a more difficult challenge looms: You need to parlay that app success into a viable business.

A successful apps-based business will demand many of the easy-to-overlook (but crucial) conventional business functions that go beyond pure app development and software maintenance, such as customer service, PR and marketing, finance, and operations.

But let's put those aside for a moment and focus on the single biggest challenge facing the developer of a successful app: generating a sustainable revenue base from which to grow a business.

Here are three ways to do that:

#1 – Paid apps

Paid apps are an indispensable component of any revenue-generating effort in the current app store environment.  Any developer offering a free app should strongly consider offering a paid version as well (and vice versa), even if alternative revenue-generation strategies are in play.

There are a few reasons for this:

  • There is a large class of consumers who simply prefer paid apps. They directly seek paid apps via the app store categories and will download a paid app over an identical free one.
  • Paid apps can easily and effectively be promoted directly by the developer.  E-mail blasts to the free app user base and interstitial notices within free apps can be a highly effective way to boost revenues and, better still, push paid apps higher in the app store rankings. By using these tools in bursts, developers can maximize the organic revenue lift from the rankings jump as well.
  • App stores treat paid apps differently.  There are times when the app store editorial teams are looking for a paid app to promote (over a free one).
  • Temporary price drop promotions are easy to execute, and very effective.

Important points to consider with paid apps:

  • Consumer expectation of lifetime support: Even at a $.99 price point, consumers have come to expect a lifetime of value and support from an app developer.  Carefully consider what upgrades you are offering and how these are messaged to the consumer.  Paid apps are a one-time charge only, and consumers rarely expect to have to pay more to unlock additional value from a paid app or have a feature set change down the road.
  • App store dynamics favor lower price points.  If you have high-cost content or expensive features, figure out how to break them up into bite-sized chunks to keep prices low and revenues aligned with costs.
  • Free app downloads tend to be the best source of leads for paid app upgrades.  Make sure you can reach your free app users effectively and convert them to the paid app.  Do so in strategic bursts with other marketing initiatives for maximum efficacy.

#2 – In-app purchases

In-app purchases can be a great way to generate revenues from both free and paid apps.  Moreover, they can be used to effectively gather information on price and feature combinations that will sell most effectively, since product pricing and descriptive information can be modified on the fly if implemented properly.
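
In practice, "modified on the fly" usually means the app pulls its product list, descriptions and price tiers from a server rather than hard-coding them. A rough, platform-agnostic sketch of that pattern, where the endpoint and field names are hypothetical:

    import json
    import urllib.request

    # Hypothetical endpoint controlled by the developer; prices and copy can be
    # changed server-side without shipping an app update.
    CONFIG_URL = "https://example.com/store/products.json"

    def load_product_catalog(url=CONFIG_URL):
        """Fetch the current in-app product list (IDs, display copy, price tiers)."""
        with urllib.request.urlopen(url, timeout=5) as resp:
            return json.load(resp)

    catalog = load_product_catalog()
    for product in catalog.get("products", []):
        print(product["id"], "-", product["title"], "-", product["price_tier"])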

Both one-time and subscription models can be realized via in-app purchase — although on certain platforms, auto-renewing subscriptions are only approved for use by certain types of applications.

For one-time in-app purchases, many of the same guidelines apply as with paid apps, although typically the promotional tactics that are used for paid apps are less effective when applied to in-app purchase (due to the limited number of ways to drive users to in-app purchase checkout flows).

Subscription models via In-App purchase represent a sustainable, renewable revenue stream for developers.  By building in time or usage-based product expiration, developers can build repeat business from customers over a longer time period.

This model requires intensive focus on repeat usage and conversion — without auto-renewal, the customer has to re-purchase the service at regular intervals, and achieving solid renewal rates can be difficult for even the most popular apps.
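
Without auto-renewal, the app itself has to work out whether a purchased term is still valid before granting access. A minimal sketch of a time-based expiry check, assuming a one-month term:

    from datetime import datetime, timedelta, timezone

    SUBSCRIPTION_TERM = timedelta(days=30)   # assumed one-month term

    def is_subscription_active(purchased_at, now=None):
        """Return True while a one-time, non-renewing purchase is still in its term."""
        now = now or datetime.now(timezone.utc)
        return now < purchased_at + SUBSCRIPTION_TERM

    # Example: a purchase made on 1 March 2013.
    purchase_time = datetime(2013, 3, 1, tzinfo=timezone.utc)
    if not is_subscription_active(purchase_time):
        print("Term has lapsed - prompt the user to re-purchase.")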

#3 – Advertising and sponsorships

Much has been written about the challenging state of mobile advertising. Inventory is outpacing demand, ad units are highly commoditized, and the upside for developers via indirect ad sales (ad networks) remains bleak as mobile eCPMs (effective cost per thousand impressions) have remained flat.

There are, however, some developers experiencing great success with mobile advertising, who tend to employ a combination of three key ingredients:

  • A unique and desirable audience
    If your app users are distinctive in some way versus typical smartphone users, they become more desirable for direct advertising or sponsorships.
  • Differentiated and targeted native ad “experiences”
    Due to the proliferation of banner advertising, distinctive ad experiences offer unique opportunities for brands and agencies, coupled with much higher response rates from consumers.  These can include rich media ads (e.g. audio/video), but targeted, "experiential" customized ad units that are unique to your app's user experience offer a high-value engagement vehicle not available elsewhere.  One great example is audio ads for apps with a strongly audio-oriented experience.  Additionally, full app takeover sponsorships give brands a high degree of exposure and offer a more memorable experience than traditional banner campaigns.
  • Direct sales
    A high performance direct ad sales team that knows how to demonstrate the unique value of mobile advertising can be a major benefit. This is difficult to build, however, and requires a certain scale before it’s even feasible.

Generating a sustainable revenue stream is by far the biggest challenge when trying to make the leap from building apps to building a business based on them. Through a savvy and thorough consideration of all the options and app store dynamics, developers can maximize their chances for success.

 

PS3: The best device for Netflix streaming…


PS3 emerges the winner, leaving Xbox and Apple TV behind!!


The PlayStation 3 has trailed Microsoft’s Xbox 360 in the video game console market for a long time. But not in everything.

Sony announced today on the PlayStation blog that the PS3 is the number one home entertainment device people use to watch Netflix on their TVs. That’s impressive.

Almost any device, it seems, can stream Netflix these days. Netflix is available on the Wii (and the new Wii U), Apple TV, Tivo, Boxee, numerous Blu-ray players, Roku digital players, smart TV platforms from multiple vendors, and, of course, the Xbox 360.

To lead that pack is quite an accomplishment.

“PS3 is our largest TV-connected platform in terms of Netflix viewing, and this year, at times, even surpassed the PC in hours of Netflix enjoyment to become our number one platform overall,” Netflix CEO Reed Hastings said in a statement.

Part of that lead is because the PS3 is a very successful mass-market console, despite the fact that the Xbox 360 has outsold it for 23 consecutive months. Part of it is because the PS3 got Netflix early, in 2009, was the first console to stream Netflix in 1080p with 5.1 surround sound, and has consistently been on the cutting edge of what's available, technologically, to control your Netflix viewing.

But part of it is probably also because the Xbox 360, while it offers Netflix, spreads its users across a wider range of entertainment options, with Microsoft working on original video content, major network TV, YouTube, live TV, and more.

Two newer innovations that Sony has brought to PS3 and Netflix include a new voice user interface system, dubbed “Max,” that is currently in testing, and the ability to pause a Netflix movie on your PS3-equipped TV, and restart it seamlessly on another device, such as your PlayStation Vita.

Innovations like that — and perhaps a breakthrough new console coming later in 2013 — might keep the PS3 in the lead.

 

5G: Expectation and Beyond!!!


5G technology could soon become the norm…!!


You may just be getting used to the benefits of fast fourth-generation (4G) wireless networks, but it’s not too soon to start looking at what 5G could bring.

Sure, the flavor of 4G that’s now dominant — Long Term Evolution, or LTE — is plenty fast. And with T-Mobile rolling out its own 4G LTE network this week, every major U.S. carrier now uses the technology. But will LTE’s maximum speed of 100 megabits per second keep up with our rising data demands over the next decade? And with wireless spectrum already a precious commodity, will we be able to use it more efficiently as we rely on cellular data more?

There's still plenty we don't know about the next generation of cellular networks. All we have to go on is the IMT-Advanced (International Mobile Telecommunications-Advanced) specification, which lays out requirements and a theoretical vision rather than a concrete radio technology like LTE.

For now, though, it’s our best glimpse at what’s next.

IMT-Advanced explained

While LTE has a theoretical capacity of 100 megabits per second for downloads and 50 Mbps for uploads (speeds we’ll never actually see in real-world usage), the IMT-Advanced specification features a theoretical maximum download speed of 1 gigabit per second, or about 10 times faster. IMT-Advanced also calls for nominal speeds of 100 Mbps in moving conditions, which means it should be faster than LTE’s maximum speed (and most home broadband connections today) in even the worst scenarios.
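
To put those figures in perspective, here is a rough back-of-the-envelope comparison of how long a 700 MB video download would take at each peak theoretical rate (real-world throughput would be much lower):

    # Back-of-the-envelope download times at peak theoretical rates (no overhead).
    FILE_SIZE_MB = 700                       # roughly a standard-definition movie
    file_size_bits = FILE_SIZE_MB * 8 * 10**6

    for label, mbps in [("LTE (100 Mbps)", 100), ("IMT-Advanced (1 Gbps)", 1000)]:
        seconds = file_size_bits / (mbps * 10**6)
        print(f"{label}: about {seconds:.0f} seconds")

    # Prints roughly 56 seconds for LTE and 6 seconds for IMT-Advanced.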

IMT-Advanced won't itself be the technology that replaces LTE; instead it serves as a guidepost against which the actual next-generation standards are measured (more on those below).

The International Telecommunication Union (ITU), the leading voice for mobile industry standards, is the organization behind IMT. It’s the same group that set the LTE standard, and it has enough clout within the industry that most carriers are likely to follow its recommendations. However, it’s worth keeping in mind that the ITU also ratified WiMax, an earlier 4G standard that never took off to the same extent LTE did.

The next-gen specification also calls for better resource management within cells (allowing for more users per cell), faster connectivity to networks, and smoother transitions between networks. For consumers, all of that means 5G networks could have fewer reception dead zones.

Here’s where things get a bit confusing: Technically IMT-Advanced describes a true 4G standard. Today’s 4G technologies, LTE and WiMax, aren’t technically fourth-generation, but as of 2010 they’re allowed to use the 4G descriptor since they represent a significant upgrade over 3G standards.

Given just how synonymous today’s technology has become with 4G, I wouldn’t be surprised if the ITU decides to officially name IMT-Advanced-derived standards “5G.”

5G won’t just be about speed (or phones)

Speed is surely the most obvious upgrade in a next-generation cellular standard, but there’s plenty more that an ideal 5G standard would need to offer. In particular, it needs to help make ubiquitous computing a reality. There’s plenty of talk about the Internet of Things, or connecting just about every single device to the web, but that dream won’t be possible with the power-hungry LTE standard.

With 5G, we’ll need low-power modes so devices can sip data without being recharged frequently. That would be similar to how Bluetooth 4.0 allows fitness gadgets to stay connected to modern smartphones without being battery hogs. We’ll undoubtedly see some sort of battery technology innovation over the next decade, but even with massive improvements, batteries will likely remain the last bottleneck for plenty of devices.

Future cellular networks will also need to use wireless spectrum as efficiently as they use power. For years we've been hearing about the supposed "spectrum crunch" in the U.S. from the FCC and wireless carriers — mostly because there simply isn't enough wireless spectrum to go around.

“What carriers really need is more overall capacity (not the same thing as speed),” said Jack Gold, principal analyst at J. Gold Associates in an email to VentureBeat. “And more capacity requires more available frequencies, which are in short supply.”

The IMT-Advanced specification’s push to handle more users per cell is one way we could see more efficient usage of wireless spectrum, but we’ll certainly need even more innovations down the line.

We’re already seeing some U.S. carriers take the focus off of phones and tablets. Verizon Wireless’s CES booth this year was filled with real-world applications of the company’s LTE network outside of typical devices, including a football helmet that keeps track of player impact data and a public recycling bin that alerts its managers when full.

The 5G contenders

Expect another standards battle as the road to 5G becomes clearer over the next few years. So far, the strongest candidates to meet the IMT-Advanced specification are LTE Advanced and WiMax release 2 (based on IEEE 802.16m). Talk about repeating history.

Naturally, both are massively improved versions of existing 4G technologies. The latest LTE Advanced specification calls for speeds up to 300 Mbps, while WiMax release 2’s theoretical capabilities should be similarly upgraded (we’re still waiting on concrete information from IEEE, the group behind WiMax and Wi-Fi).

Looking at LTE’s overall dominance in existing 4G networks, I’d wager that LTE Advanced will end up being crowned as our next cellular standard. WiMax networks from Sprint and Clearwire were the first official 4G networks, but LTE has blown WiMax out of the water with much faster speeds. Now Sprint is desperately readying its own LTE rollout to keep subscribers from jumping ship.

Though LTE Advanced and WiMax release 2 are the main 5G candidates so far, we could see some fresh entrants over the next few years. Given just how long these standards take to draft and implement, though, I wouldn’t bet on that.

A long, bumpy road ahead

Don’t expect to see carriers rushing to prepare for 5G networks anytime soon. It’ll be years before the next-gen specifications are finalized, and right now carriers are far more concerned with making sure their investments in LTE 4G networks pay off. Until we have a better sense of how 5G networks will benefit carriers, they likely won’t be too eager to think about spending billions once again on more network upgrades.

“LTE promised both higher bandwidth and better, more efficient use of that bandwidth (hence more profitable operations for the carriers) — it hasn’t really lived up to that promise so far,” noted Gold. “What will 5G offer?”

"Bazooka" attack slowing down the Internet…


 

One of the largest ever cyber attacks is slowing global internet services…

A “bazooka” cyber attack described as the most powerful ever seen has slowed traffic on the Internet, security experts said Wednesday, raising fresh concerns over online security.

The attacks targeted Spamhaus, a Geneva-based volunteer group that publishes spam blacklists used by networks to filter out unwanted email, and led to cyberspace congestion that may have affected the Internet overall, according to Matthew Prince of the US security firm CloudFlare.

The attacks began last week, according to Spamhaus, after it placed on its blacklist the Dutch-based Web hosting site Cyberbunker, which claimed it was unfairly labeled as a haven for cybercrime and spam.

The origin of the attacks has not yet been identified. But a BBC report said Spamhaus alleged that Cyberbunker, in cooperation with “criminal gangs” from Eastern Europe and Russia, was behind the attack.

The New York Times quoted Sven Olaf Kamphuis, who claimed to be a spokesman for the attackers, as saying that Cyberbunker was retaliating against Spamhaus for “abusing their influence.”

But Kamphuis told the Russian news site RT that Cyberbunker was just one of several Web firms involved, protesting what he called Spamhaus’s bullying tactics.

“Spamhaus have pissed off a whole lot of people over the past few years by blackmailing ISPs and carriers into disconnecting clients without court orders or legal process whatsoever,” he said.

“At this moment, we are not even conducting any attacks, it’s now other people attacking them.”

CloudFlare, which was called in by Spamhaus for assistance, said the attackers changed tactics after the first layer of protection was implemented last week.

“Rather than attacking our customers directly, they started going after the network providers CloudFlare uses for bandwidth,” Prince said.

“Once the attackers realized they couldn’t knock CloudFlare itself offline, they went after our direct peers.”

Prince said the so-called distributed denial of service (DDoS) attack, which essentially bombards sites with traffic in an effort to disrupt them, was “one of the largest ever reported.”

Over the last few days, he added, “we’ve seen congestion across several major Tier 1 (networks), primarily in Europe, where most of the attacks were concentrated, that would have affected hundreds of millions of people even as they surfed sites unrelated to Spamhaus or CloudFlare.”

“If the Internet felt a bit more sluggish for you over the last few days in Europe, this may be part of the reason why,” Prince said in a blog post called “The DDoS That Almost Broke the Internet.”

Prince noted that these attacks used different tactics from typical “botnet” assaults – they came from so-called “open resolvers” that “are typically running on big servers with fat pipes.”
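
The danger of open resolvers lies in amplification: a small, spoofed DNS query triggers a much larger response aimed at the victim. A rough illustration of the arithmetic, using assumed packet sizes rather than figures from this particular attack:

    # Illustrative DNS amplification arithmetic; packet sizes are assumptions,
    # not measurements from this attack.
    query_bytes = 64        # small spoofed request sent to an open resolver
    response_bytes = 3000   # much larger response directed at the victim

    amplification = response_bytes / query_bytes
    print(f"Amplification factor: roughly {amplification:.0f}x")

    # Outbound bandwidth an attacker would need to produce the reported 300 Gbps:
    attack_gbps = 300
    print(f"Attacker needs only about {attack_gbps / amplification:.1f} Gbps of queries")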

“They are like bazookas and the events of the last week have shown the damage they can cause,” he said. “What’s troubling is that, compared with what is possible, this attack may prove to be relatively modest.”

A spokesman for the network security firm Akamai meanwhile told AFP that based on the published data, “the attack was likely the largest publicly acknowledged attack on record.”

“The cyber attack is certainly very large,” added Johannes Ullrich of the US-based SANS Technology Institute, saying it was “a factor of 10 larger than similar attacks in the recent past.”

“But so far, I can’t verify that this affects Internet performance overall,” he said.

Spamhaus, which also has offices in London, essentially patrols the Internet to root out spammers and provides updated lists of likely perpetrators to network operators around the world.

CloudFlare estimates that Spamhaus “is directly or indirectly responsible for filtering as much as 80 percent of daily spam messages.”

The attacks began after Spamhaus blacklisted Cyberbunker, a Web hosting firm that “offers anonymous hosting of anything except child porn and anything related to terrorism.”

Cyberbunker denounced the move on its blog.

“According to Spamhaus, CyberBunker is designated as a ‘rogue’ host and has long been a haven for cybercrime and spam,” the Cyberbunker statement said.

“Of course, Spamhaus has not been able to prove any of these allegations.”

Prince said of the latest incident: “While we don’t know who was behind this attack, Spamhaus has made plenty of enemies over the years.

“We’re proud of how our network held up under such a massive attack and are working with our peers and partners to ensure that the Internet overall can stand up to the threats it faces.”

Experts said the attacks flooded Spamhaus servers with 300 billion bits per second (300 gigabits per second) of data. Prior large DDoS attacks have been measured at around 50 gigabits per second.

Because of the way Internet traffic flows, these DDoS attacks created congestion and ripple effects around the Web.

 

Are small cells a viable option for the 4G data influx…??


Will small cells live up to their promise..?


As data usage on mobile networks continues to grow, small cells will have an increasingly important role to play in supporting mainstream network providers, and boosting signal in areas where coverage is weak.

Already, there are large parts of the UK where consumers and businesses struggle to get good 3G coverage, due to base stations being too far apart or because the topography prevents radio signals from reaching particular areas. These areas are commonly known as “not-spots”.

With 4G technology expected to arrive in the UK on a wide scale in 2014, the demand for data on mobile networks will become even greater as the use of bandwidth-hungry applications such as media streaming and video calling increases, alongside new applications like machine-to-machine (M2M) communication.

The unprecedented growth of mobile data traffic means that data demand is outstripping network capacity. As a result, Juniper Research predicts that small cell and WiFi systems will carry nearly 60 percent of all mobile traffic over the next five years.

Small cells – such as femtocells, picocells, and microcells – are miniature base stations that provide a low-power signal in confined areas, such as indoor environments and remote outdoor locations, resulting in better voice quality, higher data performance and better battery life.

Ofcom has indicated that small cells will have to be incorporated into the 4G Long Term Evolution (LTE) network infrastructure if mobile operators are to cope with the massive surge in demand for data.

By offloading mobile data traffic onto available complementary networks, operators can optimise the available network resources and reduce the bottlenecking of services. But are small cells advanced enough to deal with the LTE onslaught?

LTE small cells come in many form factors; as well as femto, pico and microcells, there are also small cells optimised for indoor or outdoor use, those that support Time Division Duplex (TDD) and Frequency Division Duplex (FDD) variations of LTE, and those that operate at specific frequencies.

Jayanta Dey, vice president and head of R&D and consultancy for telecoms at Wipro Technologies, said that the versatility of LTE small cells is an advantage, because it means they can be used in a wide range of situations, but also creates challenges from a product engineering point of view.

One way to resolve this issue is to use a scalable software architecture for various deployment configurations. Wipro offers its own software architecture that can be easily ported to the different types of small cells.

“You need a common software that will run across the various hardware configurations, whether it is femto, pico, micro, metro, indoor or outdoor,” he said.

Dey said that the small cell market is both growing and highly competitive. In order to succeed, therefore, it is essential to have the right price-performance ratio.

“That can only happen if you have a very tight integration between the software, the hardware and the mechanical design,” he said. “It is not just about the software, it is how you do a tight integration across the various components for optimum system performance.”

As small cells evolve, an important development will be the integration of multiple technology support within the same box, according to Dey.

“You don’t want separate boxes – one box supporting LTE small cells, one supporting 3G small cells and a third for WiFi. You want one integrated box which supports all this technology,” he said.

Interference management is also an important area of R&D, because the amount of cellular coverage area subject to inter-cell interference grows from 25 percent with macro cells to 40 percent with macro and small cells. To prevent interference, the macro and small cells need to be coordinated, said Dey.

Finally, usability needs to be improved, in order to enable self-installation and allow organisations and individuals to actively monitor these devices. This should lead to more widespread integration of small cells into the 4G network.

Wipro is also developing a lightweight packet core for public safety, defence, large enterprises and rural markets. The company said these use cases require a small footprint and ruggedised systems that support multiple mobile broadband technologies.

“In a situation where commandos need to enter a public building, like a hotel or a hospital, and they are involved in some sort of combat with a terrorist group for example, they need to be able to set up a network to communicate with the central team,” said Dey.

“The central team could be using a base station in a jeep that is right outside the building, and the commandos have these units which are transporting video images to the central unit so that it aids in making better decisions. We are seeing this type of usage for public safety.”

An enterprise example could be an oil drilling company needing to set up a small LTE network to communicate with employees in remote locations and send large packets such as videos.

“For these types of enterprises that operate in remote areas but have some critical operations, small networks may become very important,” said Dey.

At this year’s Mobile World Congress, carrier and vendor executives warned that standardisation of small cells is still a work in progress. There may be hidden costs, they warned, and carriers may end up fighting over spots to set them up.

“Even though the cells cost much less than macro equipment, each still needs to be installed and have a fast backhaul connection, which is usually wired,” said Ovum analyst Daryl Schoolar. “Small cells mean more cells in a given area, so more wires and bandwidth charges for the mobile carrier.”

Small cells may be the key to enabling the mobile networks of the future, providing an underlay to support the world’s macro networks. However, there is still a lot of work to be done before their full potential can be realised.

Five Smart Apps of Today..!


Apps that boost your productivity…!

 


Smartphones and tablets powered by Android are the most popular across the world as they give users access to a host of free apps, e-books and games. While the Google Play marketplace has a large number of apps meant for entertainment, it also has a huge repository of apps that boost productivity. 

Want to know what more you can do with your Android smartphone? Here are five must-have apps for a productive lifestyle.

1. AirDroid is a fast, free app that lets you wirelessly manage your Android device from your computer's browser. You can transfer files between Android devices and computers; cut, copy, paste, search, rename or delete files on the microSD card; and receive, send, forward or delete SMS messages.

Furthermore, you can install, uninstall, back up and search apps, as well as do batch processing. The same goes for managing your photos, contacts, clipboard and ringtones. If you are a music buff, you can play, search, upload, download and delete tracks, or set them as call, notification and alarm ringtones.

2. Addicted to newsfeeds? Then check out Feedly. It is a new way to browse the content of your favourite sites, RSS feeds, Tumblr blogs and online video channels. Instead of making you hunt for information, Feedly uses RSS to aggregate the content of the news sites you like and deliver it as a fast, mobile-optimised experience.

Feedly is an RSS news reader re-imagined for Android phones and tablets. It makes browsing faster: the contents of your RSS feeds are transformed into pocket-sized ‘cards’ onscreen, which load very quickly and are easy to browse through.

3. Splashtop is the easiest and fastest remote desktop app for accessing your Windows, Mac or Ubuntu machine from your Android phone or tablet, anywhere, anytime!

Get full access to all your applications (for example office documents, email, full browser with Flash and Java support), PC or Mac games, videos (1080P), and documents on your computer over Wi-Fi or 3G/4G with intuitive touch experience (supporting Windows 8 gestures seamlessly).

 
4. This one should delight the serial note-taker in you. A simple notepad application, its new version has a send feature built in – just choose ‘text’ or ‘email’ etc., and off you go! No note is sent before its time, and all your punctuation and spelling remain the way you want them. It is also useful for more than just taking quick notes.

Because each note is simply a text file in a folder, an entire hierarchy or outline of notes can be synchronised on an item-by-item basis with notes on a computer by simply connecting your Android phone as a USB drive and using a free utility such as ‘SyncToy’. Any text editor on your computer can then be used to update the same data (see the rough sketch at the end of this list).

5. Easily the most feature-filled Android-PC remote, Unified Remote is an app that turns your Android device into a Wi-Fi or Bluetooth universal remote control for your Windows PC. Control your favourite programmes, mouse, and keyboard. 

Some of the features that the free version of this app includes are: Automatic server detection, single/multi touch mouse, hardware volume control, quick switch using swipe gestures and an auto-pause for media when you get a phone call.
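
Returning to the note-syncing idea in app #4: because each note is a plain text file, the item-by-item sync is easy to picture. A rough Python sketch of the same one-way copy, with example paths (a utility like SyncToy does this, and more, for you):

    import shutil
    from pathlib import Path

    # One-way, item-by-item sync: copy any note that is new or newer on the phone
    # (mounted as a USB drive) into the computer's notes folder. Paths are examples.
    phone_notes = Path("/media/phone/Notes")
    computer_notes = Path.home() / "Notes"
    computer_notes.mkdir(exist_ok=True)

    for note in phone_notes.glob("*.txt"):
        target = computer_notes / note.name
        if not target.exists() or note.stat().st_mtime > target.stat().st_mtime:
            shutil.copy2(note, target)
            print("Updated", target)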

 

iPhone 6 on the anvil…!


iPhone 6: look, design and features.


When Apple unveiled the iPhone 5, the reaction was a bit muted: where previous phones were massive leaps forward, the iPhone 5 was a bit longer and a lot easier to scratch.

So what can we expect from the next iPhone, the iPhone 6 or 5S? Let’s see what the crystal balls are saying.

One thing is for sure, with the release of such super handsets as the Samsung Galaxy S4, Sony Xperia Z and HTC One, the next iPhone will have to seriously up its game.

iPhone 6 and the iPhone 5S are two different phones

The rumour mill doesn’t seem too sure whether the next iPhone is going to be the iPhone 5S or the iPhone 6. Given the iPhone’s history – from the 3G onwards, there’s always been a half-step S model before the next numbered iPhone – we’d bet on an iPhone 5S first and an iPhone 6 a few months later.

iPhone 6 release date

Some pundits predict a summer release for the iPhone 6, while Money Morning reckons that the iPhone 5’s lack of NFC and Jumbotron display is because Apple’s got a proper iPhone ready for a springtime release.

It’s quite likely that Apple is moving to a two-phones-per-year upgrade cycle, but we’d bet on a springtime 5S model and a bigger iPhone 6 update in the autumn, probably September.

Even Digitimes reckons a springtime iPhone 6 is unlikely: it’s predicting a summertime reveal for Apple’s next generation phones, which again fits with a WWDC unveiling.

iPhone 6 cases

Multiple rumours say Apple’s working on plastic cases for its next iPhone, mixing plastic and metal in such a way that “the internal metal parts [are] able to be seen from outside through special design.”

It’s unclear whether such cases would be for the iPhone 5S or the iPhone 6, or whether Apple is simply considering making cheaper iPhone 4 models to sell when the iPhone 3GS reaches the end of its life.

Speaking in March 2013, a KGI analyst said it believed Apple would turn to manufacturer Pegatron to make up to 75 per cent of low cost iPhone products.

The iPhone 6 will finally do NFC

That’s what iDownloadblog reckons, quoting Jefferies analyst Peter Misek: it’ll have a better battery too, he says.

The iPhone 6 will run iOS 7 and the iPhone 5S probably won’t

Developers are seeing new iPhone model identifiers in their server logs: the device identifies itself as the iPhone 6,1 (the iPhone 5 is 5,1 or 5,2) running iOS 7, and its IP address is an Apple one. If the two-phones strategy is true, we’d expect Apple to unveil the next major revision of iOS at its WWDC conference in June, with it shipping on the iPhone 6 a few months later.

iPhone 6 storage

We’ve already seen a 128GB iPad, so why not a 128GB iPhone 6? Yes, it’ll cost a fortune, but high-spending early adopters love this stuff.

iPhone 6 home button

According to Business Insider, of the many iPhone 6 prototypes Apple has made, one has a giant Retina+ IGZO display and a “new form factor with no home button. Gesture control is also possibly included” – more on that shortly. Mind you, it was mooted that Apple would dump the home button in time for iPhone 5, but it never happened.

iPhone 6 screen

Take this one with a pinch of salt, because China Times isn’t always right: it reckons a device codenamed iPhone Math, which may be a mistranslation of iPhone+, will have a 4.8-inch display. The same report suggests that Apple will release multiple handsets throughout the year over and above the iPhone 5S and 6, which seems a bit far-fetched to us.

Patents show that Apple has been thinking about magical morphing technology that can hide sensors and even cameras. Will it make it into the iPhone 6? Probably not.

iPhone 6 processor

Not a huge surprise, this one: the current processor is a dual-core A6, and the next one will be a quad-core A7. The big sell here is more power with better efficiency, which should help battery life.

Expect to see it in the 2013 iPad first, and expect to see an improved A6 processor, the A6X, in the iPhone 5S.

iPhone 6 camera

Apple’s bought camera sensors from Sony before, and this year we’re going to see a new, 13-megapixel sensor that takes up less room without compromising image quality.

iPhone 6 eye tracking

One thing seems certain – Apple can’t ignore the massive movement towards eye-tracking tech from other vendors, especially Samsung. It seems a shoo-in that Apple will deliver some kind of motion tech in the next iPhone, probably from uMoove.

The new iPhone will have better 4G LTE

On its UK launch, just one UK network had 4G LTE: Everything Everywhere, which currently offers 4G on the 1800MHz band. In 2013, all the other big names will be coming on board, offering 4G in other frequency bands. International iPhones already work across different 4G bands from the UK’s, so you can expect the UK iPhone 6 (and possibly the iPhone 5S) to be more promiscuous than the iPhone 5.

iPhone 6 Wi-Fi may be 802.11ac

Apple likes to lead Wi-Fi standards adoption – its AirPort gear really helped make Wi-Fi mainstream – and there’s a good chance we’ll see ultra-fast 802.11ac Wi-Fi in Apple kit this year. It’s faster than Lightning, and not very frightening.

iPhone 6 wireless charging

Wireless charging still isn’t mainstream. Could Apple help give it a push? CP Tech reports that Apple has filed a patent for efficient wireless charging, but then again Apple has filed patents for pretty much anything imaginable.

The tasty bit of this particular patent is that Apple’s tech wouldn’t just charge one device, but multiple ones.

Apple to launch budget iPhones…!!


Low-cost iPhones soon to take on Android.


 

For a while now we have been hearing rumours that Apple is preparing a low-cost version of its iconic iPhone, to be launched in emerging markets to level the playing field in the race against Android.

We had heard that the cheaper iPhone would have a larger 4.5-inch display and would be launched alongside two other iPhones, but the buzz surrounding the iPhone Plus or iPhone Math device has died down. A more affordable version of the iPhone is still on the cards, however, and details are now emerging from Korea suggesting that the company's new entry-level handset will have a 4-inch display, just like the iPhone 5.

The details on Apple’s so-called budget iPhone were revealed to AppleInsider by KGI Securities analyst Ming-Chi Kuo. According to him, the specs for Apple’s low-cost iPhone were decided way back in 2011, and the company is “unlikely to abruptly change” due to a market shift towards larger-display phones or phablets.

The new, less expensive iPhone model, according to Kuo, will have a “super-thin plastic casing mixed with glass fibre.” The material will make it stronger, thinner and lighter than typical smartphone plastic casings seen regularly on devices produced by the likes of LG and Samsung.

Kuo also revealed that the handset’s casing will be ultra thin, clocking in between 0.4 and 0.6 millimeters. The phone will also reportedly come in a range of colours, much like the current iPod touch line-up. According to Kuo, there will be four to six colour options.

Earlier this week, Kuo had reported that the Cupertino-based company was likely to diversify manufacturing for its low-cost and legacy iPhones. Foxconn (Hon Hai) will build the thinner plastic casings and also assemble the phones, while Green Point of the Jabil group is expected to provide more casings to Pegatron, which will assemble the remaining batches of the iPhone. However, contrary to earlier rumours, Foxconn has not been dropped by Apple as a manufacturer. In fact, the company will be producing the next-generation iPhone, which will be released alongside the low-cost variant.

The iPhone 5S, as it is rumoured to be known, will be based on the design of the current iPhone 5. The premium iPhone 5S will be a refresh like the move from iPhone 4 to 4S and will be announced in line with Apple’s release cycle. The next high-end iPhone is expected to include a fingerprint sensor under the home button that will eliminate the need to enter passwords and potentially add new functionality such as secure e-wallet transactions through Passbook.

Broadband: Scope and reality…


A few things every user should know about broadband!


The very fact that you're reading this means you have access to some form of Internet connectivity. In fact, everybody and their uncle is online in this day and age of dime-a-dozen ISPs and the all-pervading reach of mobile Internet. Unfortunately, providing bandwidth, building the requisite infrastructure and procuring the necessary licences is a Herculean effort involving a phenomenal amount of investment. This means only big corporate entities can take up the expensive proposition of connecting you to the repository of collective human consciousness that we know as the Internet. And large corporate entities tend to squeeze every single paisa out of their customers and generally use their army of lawyers to tweak the fine print to the consumers' detriment.

The point is, if you have a niggling doubt that you are being cheated by your ISP (Internet Service Provider), chances are you are. Cheated – not in a way that would constitute a criminal case, but in the sense of ISPs being disingenuous about their terms of service and advertising, creating expectations that eventually end in disappointment. If you have ever felt that you are getting less bandwidth than you paid for, or have unknowingly run up a massive bill, it's time to get acquainted with the terminology and idiosyncrasies of your Internet plan, to avoid common pitfalls and make the best of it.

What is broadband?
In its strictest definition, broadband involves the ability to access multiple channels of data over a single telecommunications medium, using all manner of multiplexing techniques. However, as laymen we understand it as speedy Internet connectivity delivered over cable, optical fibre or a wireless medium, as opposed to the good old days when logging on to the Internet meant listening to funny sounds emanating from the modem.

The definition of broadband in terms of industry standards encompasses certain minimum speed and connectivity prerequisites. In most of the civilised world, the minimum bandwidth (speed) is pegged at 512 kbps. In India, however, an ISP must provide at least 256 kbps to qualify as broadband. Here’s the textbook definition of the term as per TRAI (Telecom Regulatory Authority of India):

“An ‘always-on’ data connection that is able to support interactive services including Internet access and has the capability of the minimum download speed of 256 kilo bits per second (kbps) to an individual subscriber from the Point of Presence (POP) of the service provider intending to provide Broadband service where multiple such individual Broadband connections are aggregated and the subscriber is able to access these interactive services including the Internet through this POP. The interactive services will exclude any services for which a separate license is specifically required, for example, real-time voice transmission, except to the extent that it is presently permitted under ISP license with Internet Telephony”.

Understanding Throughput and Data Usage Limits
Unless you are one of the few lucky souls with access to a completely unlimited, FUP-free Internet connection, download limits are the sad reality, underscoring the irony that content on the Internet is free while the means of delivering it is prohibitively expensive. Most ISPs these days offer higher speeds but lay down proportionally overzealous usage limits to curb bandwidth consumption.

Oh, did I say download limit? Well, I think I did. That's incorrect and, in fact, the most common mistake consumers make when calculating their usage. You see, usage is measured in terms of the total data that passes through your Internet account. This isn't restricted to downloads alone but factors in uploads as well. That 25 MB presentation you sent a colleague or the 2 GB of data you backed up on your "cloud" drive – all of it counts towards your data usage. So if you have been hitting your usage limit without downloading nearly that much, now you know why.
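
In other words, the meter runs in both directions. A tiny illustrative tally, with made-up numbers, shows how quickly uploads eat into a cap:

    # Illustrative monthly tally; every figure here is made up.
    monthly_cap_gb = 10.0

    downloads_gb = 6.5              # browsing, streaming, file downloads
    uploads_gb = 0.025 + 2.0        # a 25 MB presentation plus a 2 GB cloud backup

    total_gb = downloads_gb + uploads_gb
    print(f"Used {total_gb:.2f} GB of a {monthly_cap_gb:.0f} GB cap "
          f"({total_gb / monthly_cap_gb:.0%} of the limit)")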

 
Of Bits and Bytes
Anyone with basic knowledge of computers will tell you MB (Megabytes) is not the same as Mb (Megabits), even though somebody should tell that to the smaller ISPs. While they willy-nilly announce plans that appear fast, the user has to figure out the actual speeds they are entitled to. Do not fall for cheap cons like advertisements offering a 12MBps line, because if it sounds too good to be true that probably is the case. A clear example of how this affects users is seen when comparing the actual download speeds to the ISP’s advertised speeds. Do not be surprised if a 10MB file can’t be downloaded in 10 seconds on your 1Mbps line. That’s because 1 byte = 8 bits, and when you apply a bit of math it comes down to 80 seconds instead. So if a provider delivers a 1Mbps connection to your home, the ideal download speed should be around 128KBps or 1/8th of the broadband speed, provided you are the only user sharing that connection.
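
The arithmetic from that example, worked out explicitly under ideal conditions (no overhead, no sharing):

    # Ideal-case conversion from an advertised line speed to real transfer figures.
    line_speed_mbps = 1            # a "1 Mbps" plan, as advertised
    file_size_mb = 10              # the 10 MB file from the example above

    download_speed_kBps = line_speed_mbps * 1000 / 8     # ~125 KB/s (often quoted as ~128)
    time_seconds = (file_size_mb * 8) / line_speed_mbps  # megabits divided by Mbps

    print(f"Effective download speed: about {download_speed_kBps:.0f} KB/s")
    print(f"A 10 MB file takes about {time_seconds:.0f} seconds, not 10")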

 

FU Policy
Fair Usage Policy is a lovely marketing term that, like all marketing drivel, has very little meaning for the end user. We like to call it the ISP's FU Policy. What it essentially means is that the service provider decides an arbitrary usage limit for its tariff plan and drops your connection speed when you hit that limit. There are, however, some broadband services, like MTNL Triband, that do not drop F-bombs on their users. Still, most of us have experienced this at some point or another, and the particularly annoying part is that there's no easy way for users to check their usage, especially when the service provider is a local last-mile operator – our beloved (and mostly thuggish) cable-wallahs. According to TRAI regulations, the service provider has to intimate the end user when 80 percent of their usage under the FUP has been reached.

We checked with a couple of ISPs about how customers can check their FUP limit. You Broadband has a web interface where users can log in to check their usage. Tikona Broadband has a similar setup for those on its FUP plans, but it also offers other, truly unlimited plans. Airtel claims to intimate users when their limit has been reached, but the Airtel users we spoke to apparently haven't received any such intimation. In any case, it should not be the user's responsibility to track usage, since it is the ISPs that have mandated these limits in the first place; the TRAI statement is very clear on this. Making users log on to a website, and burn more of their allowed throughput in the process, is not only counter-intuitive but could also be considered a form of malpractice.

Bone of Contention

Essentially, the contention ratio is the maximum number of users who may share a broadband connection with you. As per international norms, the ideal ratio is 50:1 for home users and 20:1 for business users. TRAI has found that many ISPs compromise on the quality of the broadband connection by accommodating a higher number of subscribers per unit of capacity. This brings down broadband speeds drastically and congests networks, which is where the contention ratio comes into play.
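
A quick worked example shows what a 50:1 ratio can mean in the worst case; the 8 Mbps line capacity below is purely illustrative:

    # Worst-case per-user share under contention; the 8 Mbps capacity is illustrative.
    line_capacity_mbps = 8
    contention = {"home (50:1)": 50, "business (20:1)": 20}

    for label, ratio in contention.items():
        worst_case_kbps = line_capacity_mbps * 1000 / ratio
        print(f"{label}: about {worst_case_kbps:.0f} kbps each if every subscriber is active")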


With the number of users multiplying faster than new bandwidth is being made available, connection speeds have suffered, and quality will deteriorate further as web servers are forced to bounce requests for lack of bandwidth. Despite the advent of high-speed connections, speeds have struggled to keep up because the number of users has increased drastically – by 117 percent in the five years from 2005 to 2010. According to figures released by TRAI, the number of broadband subscribers as of February 2012 was 13.54 million, with market leader BSNL catering to nearly 65 percent of them. A big chunk of users – nearly 5 million – is split among 154 other ISPs.