Technology Android

dilla

Trumpfan17 aka Coonie aka Dilla aka Tennis Dog
Yeah, I was a bit apprehensive of the Edge when getting the S7, but I think the look has grown on me, or I simply just accept that Samsung is doing this Edge thing with all their phones now.

Plus, it adds some more space for the battery, which is nice.

Off topic, but we were looking to replace 2 laptops and 2 desktops in our office with newer machines. Any recommendation for a simple machine that will mainly be used for web browsing to access medical services? I think one requirement is an SSD, because most of our computers have slowed down within a year or two despite having few programs installed. Usually a clean wipe fixes it, but I still feel hard drive health is a big issue, so I thought an SSD would be more responsive, as it has been in our Macs.

The only thing is I don't know what OEM to go for. We've used Dells and had good luck with them, but we were looking for something under $700 each, at least for the laptops, and the XPS is obviously nowhere near that.

Lenovo?
 

masta247

Well-Known Member
Staff member
Regarding the Edge, it's not the look that bothers me, but how it feels when I try to use it. I can't get used to the distorted feel of the image that gets weirdly stretched by the edges. It always makes me feel like the screen is broken, and I can never adjust it to see things properly. Additionally, it feels very attention-grabbing, especially in public, while I'd prefer to keep people's eyes away from what I'm doing on my phone. That's why it's hard for me to imagine using a curved-display device daily, and I would rather get a non-Edge version of the S9 if it existed.

In terms of purchasing the computers, you definitely want an SSD: it's the single most important upgrade for boosting performance and for reducing the slowdown that comes with system age. Second, you need enough RAM; 8GB is the sweet spot. Third, you want fast single-threaded performance, so either a 6th/7th/8th-gen Intel Core or a Ryzen. Anything from AMD that doesn't say "Ryzen", or any Intel chip prior to 5th gen, is a no-go. If you do get the choice, keep in mind that an 8th-gen i3 performs about the same as a 6th/7th-gen i5. You won't need a better processor than that, and you also absolutely don't need a discrete graphics card: for office work the integrated Intel graphics is more than enough.
Otherwise, I frankly have no idea about the pre-built computer market, as I always just build my own, so I don't know what you would be choosing from, but the guidelines above should ensure that such computers perform well for quite a long time, and you should be able to find them within the $700 budget.
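By the way, if you want to double-check whether an existing machine already has an SSD before deciding what to replace, here's a minimal sketch. It assumes Linux, where the kernel exposes a per-device "rotational" flag under /sys/block (on Windows you'd check the drive's media type through PowerShell's Get-PhysicalDisk instead):

```python
# Sketch: on Linux, /sys/block/<dev>/queue/rotational is "0" for a
# non-rotating drive (SSD/NVMe) and "1" for a spinning hard drive.
# The /sys layout is a Linux-specific assumption.
from pathlib import Path

def classify(rotational_flag: str) -> str:
    """Map the kernel's rotational flag to a drive type."""
    return "HDD" if rotational_flag.strip() == "1" else "SSD"

def drive_types(sys_block: str = "/sys/block") -> dict:
    """Scan block devices and report which are SSDs vs HDDs."""
    types = {}
    root = Path(sys_block)
    if root.exists():
        for dev in root.iterdir():
            flag = dev / "queue" / "rotational"
            if flag.exists():
                types[dev.name] = classify(flag.read_text())
    return types

print(drive_types())
```

Not a substitute for reading the spec sheet, but handy for auditing the machines you already have.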

For laptops, $700 apiece is quite a tricky budget, because for just a little more you could snatch the entry model of the Surface Pro on sale. A lot depends on the display size you want, though, because that determines which class of device you would be looking at.
For Lenovo, sub-$700 models are usually very mediocre. The ThinkPad series is where it's at, but those are usually more expensive.
The flagship model, the X1, is possibly the greatest office laptop. Quality-wise it's even better than the Dell XPS series, and is probably the best-quality laptop, period. Within your budget you can only get refurbished models, though, which would still be a great purchase if the condition is really as advertised. Keep in mind this one is a 3-year-old model (it still looks great on paper, and X1 ThinkPads don't age fast, but I've never bought refurbs from Newegg, so your call on that one):
https://www.newegg.com/Product/Prod...&cm_re=thinkpad_x1-_-9SIA5HA5HU1678-_-Product
 

dilla

Trumpfan17 aka Coonie aka Dilla aka Tennis Dog
Well, as usual, my dad gave me the job but butted in at the last second.

We ended up going with Dell, since we had an account with them from last summer when he got his XPS 15. I think he settled on two Inspiron notebooks and two OptiPlex 3050 Micro small-form-factor desktops. I believe those have SSDs, but the notebooks don't.

Eh, it's his headache now. I think the desktops will be used more often, definitely daily, so those having SSDs relieves most of the headache. The notebooks will be for staff to use in rooms with patients, so they'll be fine for them.

Anyway, we mentioned a few weeks back about Apple changing chips.

http://appleinsider.com/articles/18...-security-chips-shift-to-apple-cpu-inevitable

Of course, these being their own chips doesn't mean much in terms of allowing AMD into their machines, but it could begin the shift away from being exclusively on Intel.

If I'm not mistaken, when it comes to the iPhone and iPad, Apple's chips were superior to the ones from the other supplier they used for part of their inventory. Was it TSMC, or something like that? Maybe Apple can bring some competition to Intel itself in a few years.

Also, Dell may go public, lol. I have come to like Dells in the past 5-7 years. I didn't have issues with them before either, but now that our office is almost exclusively Dell, we have zero issues with our XPS 12 and 15 notebooks, and an XPS desktop as well. The computers we are replacing, that I mentioned above, were Gateways. They had been in service for a long time, though, so I can't shit on them too much.

My dad did purchase an ASUS notebook last summer, and that was just a dud. We ended up having to factory-warranty our way out of that one. I thought ASUS was top-tier at one point? Maybe 10 years ago, which is also when I first heard of them. ASUS and MSI were all the rage on Anandtech back then.

I remember in 2007, I was saving up for a notebook for college and was looking at CyberPower and iBuyPower and their custom builds, but settled for a modest HP. Now Best Buy sells one of those two brands, I forget which one, online. Those kooky-looking gaming PCs. Times change.
 

masta247

Well-Known Member
Staff member
Anyway, we mentioned a few weeks back about Apple changing chips.



http://appleinsider.com/articles/18/01/29/up-to-three-macs-coming-with-t-series-security-chips-shift-to-apple-cpu-inevitable



Of course, if this is their own chips, it doesn't mean much in terms of allowing AMD in their machines but it could begin the shift from being on Intel exclusively.



If I'm not mistaken, when it comes to the iPhone and iPad, Apple's chips were superior to the ones from the other supplier they used for part of their inventory. Was it TSMC, or something like that? Maybe Apple can bring some competition to Intel itself in a few years.


That would be amazing news if they actually made full-fledged x86 chips. The more competition in that sphere, the merrier, as Intel has been completely stuck refreshing 10-year-old technology with minor upgrades.

While the T-series chips don't have anything to do with full Mac CPUs (they are just security chips), the report claims Apple is looking into making its own CPUs, and that would be awesome, like I mentioned a while ago. If Apple makes chips powerful enough for Macs, Intel will be forced to make better ones for PCs to compete in the US. It's almost hard to imagine a Mac being more powerful than a PC, but it would be an amazing way to restart the performance race.

Regarding TSMC (or Samsung): they manufacture all of Apple's chips for iPhones and iPads, but the chips are designed by Apple. Apple has no way to manufacture its own chips, as it has no factories (fabs), but it's the design that counts.

My dad did purchase an ASUS notebook last summer and that was just a dud. Ended up having to factory warranty our way out of that one. I thought ASUS was top at one point? Maybe 10 years ago, which is also when I first heard about it. ASUS and MSI were all the rage with Anandtech back then.

Asus is a very good laptop maker if you consider value for money, but their models range all over the place from really poor to really good, mostly depending on the price bracket. If a laptop is designed to be the best one for $500, it still has corners cut to be that $500 laptop.

If you want the absolute best in terms of quality, these days it's the Lenovo ThinkPad series (such as the X1), the Dell XPS series, and the Surface series. Probably in that order, too.
 

dilla

Trumpfan17 aka Coonie aka Dilla aka Tennis Dog
That would be amazing news if they actually made full-fledged x86 chips. The more competition in that sphere, the merrier, as Intel has been completely stuck refreshing 10-year-old technology with minor upgrades.

While the T-series chips don't have anything to do with full Mac CPUs (they are just security chips), the report claims Apple is looking into making its own CPUs, and that would be awesome, like I mentioned a while ago. If Apple makes chips powerful enough for Macs, Intel will be forced to make better ones for PCs to compete in the US. It's almost hard to imagine a Mac being more powerful than a PC, but it would be an amazing way to restart the performance race.

Regarding TSMC (or Samsung): they manufacture all of Apple's chips for iPhones and iPads, but the chips are designed by Apple. Apple has no way to manufacture its own chips, as it has no factories (fabs), but it's the design that counts.




Asus is a very good laptop maker if you consider value for money, but their models range all over the place from really poor to really good, mostly depending on the price bracket. If a laptop is designed to be the best one for $500, it still has corners cut to be that $500 laptop.

If you want the absolute best in terms of quality, these days it's the Lenovo ThinkPad series (such as the X1), the Dell XPS series, and the Surface series. Probably in that order, too.


The X1 is considered better than even the XPS? I thought the XPS was kind of the go-to for people who wanted something with a similar footprint to a MBP but wanted Windows. In other words, the best Mac alternative.

The Surface Book, too, I heard was great, but some people had issues with the hinges and then the keyboard, or something. Still, I thought the consensus was that MS and Dell both owned the top end of the market.

Nothing about MSI? I still see them around from time to time on Newegg.
 

dilla

Trumpfan17 aka Coonie aka Dilla aka Tennis Dog
I think the S9 "leak" I posted a few days back might have been BS? A shitty render, or something.

Came across this, which seems to be from a more reputable source: https://www.gsmarena.com/galaxy_s9_components_including_battery_pictured-news-29425.php

Off topic, but the OptiPlex we ordered finally came in. Well, both of them did. I made a silly mistake and didn't read that it comes with HDMI output only, no VGA. So I'll get some cheap, dinky adapter and hope it works. Otherwise, this thing is pretty neat: obviously no optical drive, 2 or 3 USB ports, and an SSD. It's pretty zippy, and funnily enough it's the first desktop I've used with an SSD. The first Windows machine, at least; I have used the recent iMacs.

The Dell Latitudes are completely shit. But they won't be used all that often, so no big deal.
 

masta247

Well-Known Member
Staff member
The X1 is considered better than even the XPS? I thought the XPS was kind of the go-to for people who wanted something with a similar footprint to a MBP but wanted Windows. In other words, the best Mac alternative.



The Surface Book, too, I heard was great, but some people had issues with the hinges and then the keyboard, or something. Still, I thought the consensus was that MS and Dell both owned the top end of the market.



Nothing about MSI? I still see them around from time to time on Newegg.
The XPS series is like a MacBook alternative, yes. As in, it is a consumer product that can also be used for business.

The X1 is more like the highest-grade enterprise product that also happens to be sold at retail. Those are the computers that break very, very rarely.
Even if two laptops look the same on paper, you can be sure the X1's critical components are of cutting-edge quality.

I have had two ThinkPad laptops, both bought used when I was a student. One is over 10 years old now and was in constant use by my mom until two years ago.
Even the battery still holds a charge for almost 2 hours, and it has never had a single issue. On top of that, the components most likely to fail are easily accessible to replace: you pop the battery or hard drive out, pop a new one in, and it works great again. Those computers are made to last for years, and unless you are very unlucky, they will.

The Surface line is designed for quality, but sometimes something ridiculous pops up, such as a random battery drain while the computer is asleep. It usually gets sort of fixed through updates, but such things happen. I had an update that randomly made my speaker "pop" whenever the computer turned on. That was also mostly fixed, but only after about a month.
Another problem is that Surface computers are very under-specced for the price. For $900 you get 4GB of RAM, the slowest possible Intel Core processor, and a slow 128GB SSD, and even the highest, absolutely ridiculously priced configurations don't offer dedicated graphics. I like my Surface Pro: it is very well made and the display quality is second to absolutely none, but it has its occasional problems, and they aren't as bulletproof as the ThinkPads, which are virtually indestructible (many of them are actually drop-proof and spill-proof even when not advertised as such).

Still, both the Surface and the Dell XPS series are very, very good computers, but the X1 is probably the highest-quality laptop range if you need that. You can buy a 4-year-old ThinkPad that will likely still outlast a new Surface, which in turn will last much longer than an average laptop. On average, of course; shit can happen to the best of them.


It's tough to say how much phones used to cost on a 2-year contract, now that those are defunct, but knowing how much a phone costs on the installment plans carriers have now, this is disheartening: http://www.techradar.com/news/samsung-galaxy-s9-price-leaked-industry-insider-indicates-cost-will-be-789



I feel it would be easier to not upgrade via my carrier, lose my 2-year contract pricing, and just buy a phone outright.
If true, that would be insane. The S8 was already overpriced, which was the main reason I didn't get it. It's not like the new components require such higher costs. Sure, memory is getting more expensive, but not by that much.

I would like to upgrade, but I don't see a viable option of doing so without feeling severely ripped off.


Speaking of chips, Apple is moving to Intel over Qualcomm?



https://9to5mac.com/2018/02/04/kgi-2018-iphone-qualcomm/?pushup=1



KGI seems like a reputable source, even if they are just analysts making predictions.
Both companies operate the same way: as soon as they have a chip that can compete on paper with the industry standard (even if some corners are cut), they throw money at the device makers to get them to use it instead of picking established suppliers. I guess Intel has more money to throw.

Intel and Qualcomm are both asshole companies, but Intel is the OG asshole who invented those tricks.
They seem to have a good "working relationship" with Apple, and by that I mean Intel probably does its best to undercut Qualcomm and throws in extra incentives, like discounts on laptop processors or guaranteed prices on future products.
 

dilla

Trumpfan17 aka Coonie aka Dilla aka Tennis Dog
https://arstechnica.com/gadgets/2018/02/android-wear-is-getting-killed-and-its-all-qualcomms-fault/

While I wouldn't turn down an Android Wear or Apple Watch, I still haven't been itching to get one, at least not at this moment. My sister has an Apple Watch to complement her iPhone 7+, and I know even she forgets to wear it unless she actually comes into a situation where she would want it. Like when "no phones allowed" applies while she's working in the hospital, but she can still get messages on her watch. Or, you know, check the time.

Still, I see so many others with Apple Watches, and it's nuts. Can't say the same for Android watches. I walked by a few watch stores, including Movado, the other day, and while they do have a poster of the one or two Android Wear models they're selling, I still don't see the hype and excitement from customers, or from stores trying to sell their smartwatches. At least nowhere close to what I see at vendors that offer the Apple Watch, like Best Buy and other tech stores.

Samsung was giving away their smartwatches with S7 and S8 purchases, and I still don't see people wearing them.

Some comments I read from Android users were that they are moving closer to Apple in general because they don't think Google has a vision or strategy for any of its services and products, which is a common complaint from anyone who uses Google services outside of Google Search and Gmail. Or owns a Nexus Player. Or Nexus-anything.

I have to look into the limitations of using an Apple Watch with an Android device (I'm sure there are many), just so I know for the future, in case wearable tech gets as big as Apple has made it out to be. I wouldn't switch to an iPhone just to have an Apple Watch, and I wouldn't necessarily even get an Apple Watch. Some Android smartwatches might look better than the Apple Watch's generic design, but if their functionality is hit-or-miss and Google lets the Android Wear OS stagnate after a while, what's the point?
 

masta247

Well-Known Member
Staff member
^I have yet to see an Apple Watch in the wild. If they sell, I assume it's more of a US thing? I've only seen a couple of Samsung and Sony watches, and one from Asus back in the day, but none of them are popular at all, to the point that seeing one at all is a very rare occurrence.

Personally, I'm not into smartwatches the way they currently are. Having to look at my phone a couple of extra times when I'm out feels like much less hassle than having to charge a watch every single night. Especially since checking notifications coincides with doing other things on the phone anyway, and if I saw a notification on my watch, I would grab my phone to respond anyway, which is exactly the same action I would take if I heard a vibration in my pocket. I just don't see the point.

We took advantage of the Gear Fit deal that came with the S7; my girlfriend has it. It has a longer battery life than any other smartwatch, yet she still isn't using it, as even weekly charging is bothersome compared to a regular watch that just works for years. It was a fun toy for a few days, but for regular use it doesn't add any value, yet adds extra hassle.

Regarding the article: with current technology I don't think we'll see any amazing smartwatch anytime soon, so I'm not surprised the companies aren't bothering with minor refreshes that would extend the battery by maybe 30%, as that costs money but doesn't solve any real problems. Besides, Ars' point is invalid, as anyone wanting to make a "better" smartwatch is free to use the Samsung smartwatch chips, which Samsung happily sells to anyone who wants them. They are the smallest and most efficient smartwatch chips in existence yet, which still isn't saying much. The point is, nobody's buying, and nobody is investing much in their smartwatches except Samsung and Apple, and even those aren't popular.
 

dilla

Trumpfan17 aka Coonie aka Dilla aka Tennis Dog
I think there's definitely a cultural divide on Apple vs Android between the US and the rest of the world. Here, Apple Watches are everywhere, as are Apple devices in general. Pharmaceutical reps come in with iPad Pros for their presentations on a drug. Many of them wear Apple Watches, or regular watches; certainly not Android smartwatches. They are also almost always carrying an iPhone.

It is entirely possible that the pharma companies are providing a lot of these devices. They certainly provide the iPad Pros, and possibly the iPhones, but I'm not so sure about the Apple Watches. I feel like at some point the rep's own preferences come into play, and they too buy Apple devices for personal or business use to complement their Apple business devices. I think companies in the US that provide these devices have almost all embraced Apple. Maybe because Apple is a US company and they want to "go American"? Maybe Apple just plays nice with most any system a company would have in place for email and data-related stuff? I don't know.

I would get a smartwatch if it were reasonably priced, but I fear the same thing that happened with you and the Gear Fit: that it would just become a novelty after a while and I'd forget about it. The charging isn't too big of an issue. I had a Jawbone UP3 fitness tracker, before they went out of business, and I had no trouble remembering to charge it every 5 or so days. I would just take it off, take a shower, and that was enough time to give it a 60% charge and another few days of use. I'm not sure smartwatches charge that quickly, though I could be wrong. 20 minutes of charging for another day of use, if you're dangerously low, doesn't seem too far-fetched.

But carrying my phone with me is a habit, so I can't imagine my phone resting a few inches from my wrist on a table being too much of a hassle, such that I'd resort to checking messages on a smartwatch on my wrist instead. I do like the fitness-tracker aspect that many watches have, though. I've heard the Apple Watch's sensors are very accurate, and now they're working on letting diabetics check their sugars via their phones. I'm not sure how that works, but someone in the health care industry has faith in Apple Watches sticking around, as well as in their reliability and accuracy in gathering biometrics.

It may be a few years before we know for sure what place smartwatches will have in tech, if any, and right now it seems too early to say. Still, I think smartwatches are selling very well, even if just as a fashion statement and not so much for their currently limited capabilities.
 

masta247

Well-Known Member
Staff member
I didn't realize that Apple Watches sell that well there. To me, the smartwatch always felt like a hype word that sort of died down, a little like 3D.

Maybe ArsTechnica, being American, didn't realize that it's just not a thing globally, so it's harder for them to see that companies aren't investing in smartwatches anymore simply because, globally, they just aren't popular. Maybe Apple makes enough profit in the few markets where they sell fairly well that it's worth it for them? After all, they're cheap to make but sold for quite a sum. I'm more surprised at Samsung for sticking with them, as they even make decent smartwatch chips that nobody buys. I'm most surprised at Ars Technica being surprised that companies aren't keen on investing in them.

The batteries do charge really fast on smartwatches. The bigger problem for me is the whole additional process: having another charger plugged in, and another device needing frequent recharges. Heck, even a regular watch already feels like too much hassle for me, so I don't wear one on most days. A smartwatch just adds substantially to the hassle in my book.
I can't see smartwatches becoming interesting again anytime soon. It would take a lot for that to change, such as a battery breakthrough and some sort of functionality breakthrough, because to me they just feel like an unnecessary extension of a phone, one that sends you notifications you can't do much with. I don't see them as fashionable either, as none of the smartwatches are pretty enough to compete with regular watches in that regard, unless someone is going for a geeky style, and by now I think the people who could find that appealing realize it's more of a gimmick than cool technology, sort of like the cheesy LED belts from back in the day. The only smartwatch that came close to being fashionable was the Moto 360, but it was just too bulky to pull it off, not to mention that it failed in other regards, and then they didn't bother refreshing it.
 

dilla

Trumpfan17 aka Coonie aka Dilla aka Tennis Dog
Yeah, and it doesn't seem like sales are slowing; instead they're growing: http://www.businessinsider.com/apple-outsold-the-entire-swiss-watch-industry-in-2017-2018-2

And now they've released the HomePod, so let's see how that pans out for them.

About this issue of fragmentation, it's crazy that it still hasn't been fixed: https://www.gsmarena.com/counterclo...sions_no_longer_reach_majority-news-29554.php

I know OEMs and carriers play a role in all of this, sometimes putting more than two middlemen between Google and an update, but this is ridiculous. Some people are stuck on 4-year-old software; put another way, four years ago was the last time even half of Android's users were on the same OS version.
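Just to illustrate what that fragmentation means for developers, here's a quick sketch with made-up share numbers (not the article's actual data): compute how many users you can still reach if your app requires a given minimum Android version.

```python
# Illustrative Android version distribution, ordered oldest release first.
# The percentages are hypothetical, purely to demonstrate the calculation.
distribution = {
    "4.4 KitKat": 12.0,
    "5.x Lollipop": 25.0,
    "6.0 Marshmallow": 28.0,
    "7.x Nougat": 27.0,
    "8.x Oreo": 8.0,
}

def reach(min_version: str, dist: dict) -> float:
    """Cumulative share of users on min_version or anything newer."""
    versions = list(dist)
    start = versions.index(min_version)
    return sum(dist[v] for v in versions[start:])

print(reach("7.x Nougat", distribution))  # 35.0 with these made-up numbers
```

With shares like these, requiring a 2016 OS release would cut you off from roughly two-thirds of users, which is exactly why apps keep supporting ancient versions.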

It sucks that the choice for people who want an Android phone is to buy a Google phone that isn't quite the best in the industry but does get timely updates, or to buy a phone from Samsung and get more features and maybe better performance, but remain on an older OS for a longer period of time.
 

masta247

Well-Known Member
Staff member
To be fair, it doesn't matter much as long as you're on anything since KitKat, as the changes since then have not been substantial, and all apps work on all of those versions anyway. I played with my mom's S4, which is on 5.0.1 (Lollipop), and can't tell a difference, other than that it used to look better and had less built-in Google bloat (it had more Samsung bloat, but that you could easily disable).

On the other hand, I find it crazy that Android has such an update scheme; nothing else comes close to how bad it is. It really sets the precedent for horrible update systems in the tech world. With Windows, Mac, or any other Linux-based system, you always get updates straight away. Windows is VERY fragmented, but it still delivers updates to all computers worldwide right after they are released. The difference is that none of those other systems are tied in any way to the drivers or overlays the user or OEM might have.
Microsoft simply tells the component makers "we are releasing the new OS version soon, you'd better be ready!", and so far they have always been ready. You get updates even for 10-year-old components, and only recently have new components stopped officially supporting 10-year-old systems, though they still work.
 

dilla

Trumpfan17 aka Coonie aka Dilla aka Tennis Dog
To be fair, it doesn't matter much as long as you're on anything since KitKat, as the changes since then have not been substantial, and all apps work on all of those versions anyway. I played with my mom's S4, which is on 5.0.1 (Lollipop), and can't tell a difference, other than that it used to look better and had less built-in Google bloat (it had more Samsung bloat, but that you could easily disable).

On the other hand, I find it crazy that Android has such an update scheme; nothing else comes close to how bad it is. It really sets the precedent for horrible update systems in the tech world. With Windows, Mac, or any other Linux-based system, you always get updates straight away. Windows is VERY fragmented, but it still delivers updates to all computers worldwide right after they are released. The difference is that none of those other systems are tied in any way to the drivers or overlays the user or OEM might have.
Microsoft simply tells the component makers "we are releasing the new OS version soon, you'd better be ready!", and so far they have always been ready. You get updates even for 10-year-old components, and only recently have new components stopped officially supporting 10-year-old systems, though they still work.

I was thinking the same thing. I'm not sure from a software engineer's standpoint, but to the layman it seems like Windows is set up much like Android: there are so many OEMs that use Windows, and they all use such a variety of parts, even within their own lines of computers, yet you don't hear much about compatibility and stability problems with Windows. If an issue affects a Dell, it probably affects an HP, Lenovo, Sony, etc., and it's usually a software issue and not the fault of the OEM, or at least not entirely.

Is there any possibility of Android becoming more like Windows in terms of how deep a hand Google takes in keeping its OS unified across different smartphone OEMs? Or is a mobile OS more complex than a desktop OS? If OEMs want to add their own flair to Android, like TouchWiz or whatever Moto/LG/Sony's equivalent is, why is it more complicated than simply changing a theme on Windows to make it look different?

Like with Samsung: there are many Samsung services that people may consider bloat, such as Samsung Health, Samsung+, Samsung Music (the streaming app, which might be dead now), Themes, the Samsung App Store, and Cloud. Why not just make those installable from the Samsung App Store, and have the App Store be the only "bloat" installed by Samsung out of the box? Same with making TouchWiz a launcher like Nova, as opposed to the deeper integration it has now. I like TW and wouldn't mind it out of the box, but not everyone is like me. I don't want all those extra Samsung apps slapped on either, but others might.

Couldn't it just be part of the tutorial when you first get your phone, showing how to install the apps needed to retain the TW experience, so that the phone comes with vanilla Android out of the box if you opt to install nothing from Samsung?
 

masta247

Well-Known Member
Staff member
Yeah, it would be awesome, but somehow Google doesn't roll that way. They make a new Android release, share the source code with everyone, make sure they've ported it to their own Nexus/Pixel phones, and literally call it a day.
Windows, by contrast, actually contains generic drivers for everything, to ensure that everything always at least works.
If an OEM wants its proprietary drivers or software supported, it has to build them on top of what Windows offers and share them with Microsoft, which can then deliver them through Windows Update to all machines running the component in question, or else the OEM has to provide them some other way. Basically, the part makers have to work hard to provide software that makes their parts work best on a given Windows version, and they always do.

With Android, the device makers are completely fucked, as they are all on their own. They have to do all the work, basically assembling a whole phone OS from many different puzzle pieces, sometimes with some pieces missing. Android itself is just one of those pieces; dozens of drivers have to be built in, one for each third-party component that a given phone uses. For instance, Qualcomm can just say "fuck you" to the likes of Sony and not provide the drivers, and then Sony couldn't update any phone using that Qualcomm chip. If Broadcom decided it can't be bothered to update a last-gen series of modems for Android 8.0, phones using those modems simply can't be upgraded to 8.0, just because of that.

Then, on top of that, the phone maker has to gather all the pieces together, combine them with the generic Android code, slap their own skin, apps, and other differentiators on top, meet thousands of carrier requirements for carriers in each country, and then test every version for every country to make sure they all work. From a software dev's perspective, saying that this is an absolute nightmare is a major understatement. The system is absolutely horrible, and it makes it understandable why even OEMs with all the money behind them don't support more than two major Android releases per flagship. If you noticed, Samsung mentioned Oreo for the 3-year-old Galaxy S6. Why? Because that is the first Samsung phone that uses literally all-Samsung components, at least.
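To make that combinatorial pain concrete, here's a back-of-the-envelope sketch. All the counts are hypothetical, just to show how fast the firmware variants multiply:

```python
# Every (chipset, carrier, region) combination is effectively a separate
# firmware build that has to be assembled, certified, and tested.
# These numbers are invented for illustration, not real OEM figures.
chipset_variants = 2    # e.g. Qualcomm vs in-house silicon
carriers = 30           # carrier-customized firmwares
regions = 15            # per-country regulatory/testing requirements

builds_per_update = chipset_variants * carriers * regions
print(builds_per_update)  # 900 separate firmware images for ONE OS update
```

Even with these modest made-up numbers, a single OS update fans out into hundreds of builds, which is why OEMs cap support at a couple of major releases.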
 

dilla

Trumpfan17 aka Coonie aka Dilla aka Tennis Dog
Yeah, it would be awesome, but somehow Google doesn't roll that way. They make a new Android release, share the source code with everyone, make sure they've ported it to their own Nexus/Pixel phones, and literally call it a day.
Windows actually contains generic drivers for everything, to ensure that everything always at least works.
If an OEM wants their proprietary drivers or software supported they have to make it on top of what Windows offers, and share it with Microsoft which can provide it to all machines running a given component marked by that OEM through Windows Update, or the OEM has to provide it otherwise. Basically, the part makers have to try hard to make sure they provide the software that will make their parts work best on a given Windows version, and they always do that.

With Android, the device makers actually are completely fucked, as they are all on their own. They have to do all the work and basically assemble a whole phone OS from many different puzzles, sometimes with some pieces missing. Android is just one part of that, dozens of drivers have to be built in, for each third party component that a given phone uses. For instance, Qualcomm can just say "fuck you" to the likes of Sony and not provide the drivers, therefore Sony would be fucked and couldn't update any phones using that Qualcomm chip. If Broadcom decided it can't be bothered to update a last-gen series of modems to Android 8.0, phones using that chip will not be able to be upgraded to 8.0 just because of that.

Then, on top of that the phone maker has to gather all the pieces together, combine them with the generic Android code, then slap their own skin, apps and other differentiators on top of those, then thousands of carrier requirements for carriers in each country, and then test all versions for each country to make sure that all work. From a software dev's perspective, saying that this is an absolute nightmare is a major understatement. The system is absolutely horrible, and it makes it understandable why OEMs with all the money behind them don't support more than two major Android releases per flagship. If you noticed, Samsung mentioned Oreo for the 3 year old Galaxy S6. Why? Because that is the first Samsung phone that uses literally all-Samsung components, at least.


I wasn't aware of Oreo coming to the S6. I've barely heard anything about the S7 getting Oreo; the first time was this past weekend, when some ROMs leaked for the AT&T variant. I'm not going to flash it, though, because I had a horrible experience with the Froyo leak for the HTC Droid Eris back in 2010: it had a phone-breaking bug that would kill audio completely until you restarted the phone. No in-call audio, no audio notifications, nothing.

Otherwise, I'd do it in a heartbeat to see what Oreo offered and if I could squeeze another six months or so out of my S7 and wait for a worthy upgrade if the S9 isn't all that great.

Not to be a drama queen again, but I'm slightly considering the iPhone, and it's a tougher call this time around because iOS and macOS alike have had some disastrous releases in the past year or so: stuff that breaks the OS, like special characters causing crashes. Still, I would appreciate the consistent, timely OS releases Apple puts out, so that no phone is limited by its software, only by its hardware. I can deal with that. Not many Android OEMs can honestly say that hardware plays a role in their decision to end support.

The closest Android alternative to the iPhone model is the Pixel, and I'm all for it if Google actually releases one without some controversy around it, like the Pixel 2's screen or camera or whatever that debacle was about. I know no phone is safe from scrutiny soon after release, and a lot of it gets blown out of proportion, but some cases are worse than others. Nothing tops Samsung's Note 7 lol
 

masta247

Well-Known Member
Staff member
Yeah, Oreo is actually coming to the S6, which surprised me too, as I'm still on the S6 and would never have expected to get it on that phone (it's exactly 3 years old now):
http://www.trustedreviews.com/news/samsung-galaxy-s6-note-5-android-oreo-update-3391192

Frankly speaking, 3 years of updates is good enough in my book. On the other hand, I can't say I'd be sad to miss one or two Android releases, as they haven't really changed much for a couple of years now. Same with iOS; I would actually be happier if my iPad Air had stayed on iOS 10. It feels like these days they mostly rearrange menu items, break something in the process, and call it a new OS version. Gone are the days of Ice Cream Sandwich, Jelly Bean or even KitKat, which came with huge improvements. To add to that, almost any app from the Play Store will still run if you're on KitKat, and ~98% will work on Jelly Bean. For developers, the most common minimum API level you'd pick when building a new app today still corresponds to Android 4.0. You are literally more likely to have hardware too outdated to run something than a software version that's too old: devices running Android 4.0 still had ~1GHz single-core processors, 512MB of RAM and under 1GB of storage (my Xperia Arc running 4.0 had 320MB of total app storage, lol).
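For context on what that minimum API level looks like in practice: an Android app declares it in the module-level build.gradle, and that number, not the phone's latest OS update, decides which devices can install the app. A minimal sketch with illustrative values (not from any particular app); API level 14 corresponds to Android 4.0 (Ice Cream Sandwich):

```groovy
// Module-level build.gradle (illustrative values, not a real project)
android {
    compileSdkVersion 27        // build against the newest SDK (Oreo 8.1 era)

    defaultConfig {
        minSdkVersion 14        // installs on Android 4.0 (Ice Cream Sandwich) and up
        targetSdkVersion 27     // opt in to current platform behavior
    }
}
```

With minSdkVersion this low, the Play Store serves the app to practically every device still in use, which is why being a version or two behind on Android rarely locks anyone out of new apps.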

Also, it feels like the last software update a given device gets sort of breaks it, as it's not as well optimized for that device. It's understandable, since fewer resources get dedicated to older models, and that plays right into planned obsolescence. Since Jelly Bean, which revolutionized smoothness on Android, I don't recall the "performance improvements" of a new OS version ever actually making a given device faster; usually, new versions are slower. In my experience the software release a device ships with is always the fastest, because the most effort goes into making it as flawless as possible so the device looks better than the competition and sells.
The later updates are mostly there to tick a checkbox, as in "we also have the newest Android and its features on that phone. Look, we still support it!"

I feel like the days of real novelty in both software and hardware have been over for a few years now. Smartphones have entered the phase PCs reached a while ago: they can be used for much, much longer as they are, because new releases bring far less novelty or improvement on either front.

These days all devices get bug fixes for years anyway. Big software updates were far more critical back in the day; today they mostly add very minor things that don't change the overall experience much.
 
