Posted on 01/29/2015 5:39:36 PM PST by Red in Blue PA
Samsung's Galaxy S5 was a monster flagship, and now we've been able to put together everything you need to know about Samsung's upcoming Galaxy S6, and it's shaping up to be the most amazing phone we've ever heard of.
From a CPU that's 50% faster to an incredible quad-HD display, Samsung hit every angle with this year's flagship phone. It will also have a payment feature that works with both magnetic-stripe and NFC terminals, incredible cameras, and a gorgeous glass-and-metal body.
This is shaping up to be the Android phone to beat this coming year.
We were sent photos from our trusted source, though we were not allowed to publish them, so you'll have to make do with the detailed specifications below.
I want one immediately!
Here's everything you need to know:
- 64-bit eight-core 14nm CPU that is 50% faster
- 5.1-inch Quad HD Super AMOLED display with a 577ppi density, stunning outdoor visibility, and a super-dim mode for late night
- A huge 20-megapixel OIS camera sensor and a 5-megapixel f/1.8 front-facing camera with real-time HDR
- 32 / 64 / 128GB of storage
- 2550mAh battery with built-in wireless charging
- Four hours of usage from a 10-minute charge (quick-connect charging)
- Samsung Pay: works with 90% of existing magnetic-stripe payment terminals, plus NFC payment terminals
- Metal and glass body with Gorilla Glass 4
- Cat 6 LTE
For detailed reviews and specifications go to www.phonescoop.com, a very good phone review website.
Without getting too technical, every study I have seen has shown that the average Thread-Level Parallelism (TLP) for ARM cores comes in at 1.4.
This means that only about 1.4 cores are doing the work on average, so in an octa-core device roughly six of them are just idling 99% of the time.
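To put a number on that claim, here is a back-of-the-envelope sketch of what a TLP of 1.4 implies for an eight-core part. The 1.4 figure is the one cited above; the helper names are mine:

```python
# Rough utilization math for an 8-core phone SoC, using the
# average thread-level parallelism (TLP) figure of ~1.4 cited above.

def core_utilization(tlp: float, cores: int) -> float:
    """Fraction of total core capacity actually doing work."""
    return tlp / cores

def idle_cores(tlp: float, cores: int) -> float:
    """Average number of cores sitting idle at any instant."""
    return cores - tlp

if __name__ == "__main__":
    tlp, cores = 1.4, 8
    print(f"Utilization: {core_utilization(tlp, cores):.1%}")
    print(f"Idle cores (avg): {idle_cores(tlp, cores):.1f}")
```

By this arithmetic the eight cores are only about 17.5% utilized on average, which is the "not a good use of silicon" point being made.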
Not a good use of silicon if you ask me. (BTW, I have put ARM cores into ASICs, and they are pricey, but hey, Samsung can afford it.)
If you are happy that's good. Samsung is a great company and I love to work with them.....on other things.
I have a way to thwart that... My phone is just used for work and is not surgically attached to my hand or ear. It gets left home A LOT, so they can listen in to an empty room all they want. Maybe they like listening to my dogs snoring...
I’m seriously considering starting a business where I mod your phone by killing the GPS location chip and burning the power trace on the motherboard... and the same for cars: kill the location data, and make the memory that stores throttle and brake position and all the other data the police will use against you volatile, or so small that it constantly overwrites itself.
Sorry to tell you this, but they don’t use separate GPS chips anymore. The GPS radio is combined with the Bluetooth, Wi-Fi, cellular, NFC, and FM radio functions (if equipped) into one piece of silicon, or GaAs.
The GPS function is done in software with hardware acceleration.
No chip to break.
Be afraid... Very afraid! The way I see it, Game over!
“It is very hard to write software that actually uses multiple cores ...”
For some people ;-) ... I certainly see your point, but a lot of people are getting a lot better writing applications for parallel processing systems.
Parallel processing in general has gone from “solution looking for a problem” to “solution to performance issues since we can’t continue to increase clock frequencies on CPUs easily” in recent years. It’s not hype ... it’s the best way to design general purpose CPUs moving forward.
AMD’s latest and greatest desktop processor requires 210W to run at 5GHz with all cores blazing ... that is an insane amount of power (watts, not “performance power”) ... that requires some serious cooling.
Intel went the parallel processing route to kind of work around the insane power consumption required by high clock frequency CPUs. It’s not marketing nonsense ... it’s a bona fide solution to a big problem.
Of course, as you pointed out, not many programmers exploit parallelism well ... however, that is certainly changing. Some of the tools I use for FPGA development are doing a great job using multiple processors for a single FPGA build. A lot of gaming companies are FINALLY taking advantage of it ... it’s only a matter of time before something standardized will be used to exploit all of the cores we have sitting in our CPUs :-).
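For what it's worth, the "exploit all the cores" pattern being debated here is fairly approachable now. A minimal sketch (the toy workload and names are mine, not from the thread):

```python
# Minimal example of fanning a CPU-bound job out across all available
# cores, the pattern discussed above. The workload is a toy stand-in.
from concurrent.futures import ProcessPoolExecutor
import os

def busy_work(n: int) -> int:
    """Toy CPU-bound task: sum of squares below n."""
    return sum(i * i for i in range(n))

def run_parallel(jobs):
    # One worker process per logical core; the OS spreads them out.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(busy_work, jobs))

if __name__ == "__main__":
    results = run_parallel([10_000] * 4)
    print(results)
```

The point of the pattern is that the per-task function stays simple and sequential; the pool handles the distribution across cores.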
Someone's going to make money disabling it somehow; if that's the case, it'll be a software solution.
Yes, those of us in RTL VHDL or Verilog land know how to do it. That's why we get the big bucks.
The problem is that the average app coder is lazy and just wants it to work so they can get paid; they have no clue what they are doing.
So here is an interesting question: how many cores is your compiler using? And be careful, how many did you pay to have unlocked?
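On the "how many cores is your compiler using" question, it's easy to check what a build could use versus what it does use. A quick sketch (the `-j` flag shown is GNU make's; other toolchains, including the Xilinx tools discussed later in the thread, have their own knobs):

```python
# Quick check of how many logical cores a parallel build could exploit,
# and the GNU make flag that would actually put them to work.
import os

cores = os.cpu_count() or 1
print(f"Logical cores available: {cores}")
print(f"Suggested build command: make -j{cores}")
```

If the build tool is invoked without such a flag, everything beyond one core sits idle no matter how many you paid for.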
Oh, that's yet another thing that you can do with a Samsung but cannot with an Apple... expand the memory.
preach it brother!
Well, you could get an app through Android, but never through MFi (Apple's "Made for iPhone/iPad" program), no way, no how.
Why? Did the old one stop making and receiving calls?
Apple doesn’t even have Swype? One more reason to never buy an Apple.
And there is no way to do that on an iPhony
good point!
I got the Note 4 in November. Absolutely love it. It is so light, speedy. Put a 64 GB card in it. Haven’t made a dent in the storage capacity I have available.
I love the Samsung for a few important reasons. I travel internationally and absolutely REQUIRE the ability to swap batteries. Moreover, I want to carry a lot of "stuff" on my phone, so expandable memory, especially for a 13-hour flight, is also critical. And the IP67 rating of the S5 has already saved my butt. That is a huge factor and benefit.
The rest of the details, to me, are personal preference. The iPhone was easier to set up but quickly becomes boring to me. Frankly, there are more "early stage" apps in the Android world, and I like to see them and evaluate them. Also, I first bought a Samsung Gear Neo and more recently switched it for the Moto 360, and I love the smartwatch. There is no way I'm paying $350 or more for the Apple equivalent when it won't do much more, if anything, than my current watch.
But, I also believe the Android platform is not for the weak, the old nor the naive. My sister-in-law is a really nice lady but technology-illiterate. She switched from iphone to android, based on bad advice from her 17 year old daughter. What a disaster. She just couldn't get it figured out because "it had too many options." I love those options.
The S6 looks awesome and I'd be happy to get that one.
Finally, don't overlook the stable of Microsoft phones and their UI. I know times have changed when my 16-year-old and 13-year-old BEG for the Surface Pro 3 over the MacBook because of the functionality. I bought the SP3 for my 16-year-old, and she uses it every day for school: writing notes with the pen, Netflix, etc. I use a Dell Venue 11 Pro, and the new MS platforms should not be underestimated. If Apple doesn't move into touchscreen laptops, they'll become a very good phone and tablet company and miss this huge shift in the market.
I sure hope this new model can turn around their declining profit margins. They really need this now.
“So here is an interesting question how many cores is your compiler using, be careful, how many did you pay to have unlocked?”
I don’t understand the question :-) ... Xilinx does a good job using multiple cores during mapping and place and route. I can enable 2 cores during the mapper, up to 4 during place and route (I might have that reversed ...)
OK, I think I see your question :-). My machine at work is a 4 core Xeon, so I paid for 4 cores and have been using all of them from day one :-) . I usually have two running during builds, one open for simulations (I use Modelsim PE ... no parallelism or 64 bit support ... it’s torture), and one open for “other stuff”.
Prior to that, I had some kind of quad core Intel processor back before Xilinx supported multicore builds. Yes, in that case, I had two cores idle most of the time :-). So I paid for 4, but only used two most of the time. One core for work, the other for Free Republic, I mean, “other stuff” :-). The others mainly sat idle.
We’re actually exploiting the two ARM CPU cores in a Zynq right now FWIW (not just OS scheduling processes ... I mean real parallel processing stuff :-) )
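The "real parallel work on two cores" split mentioned here can be sketched generically. This toy version divides one job between exactly two workers; the actual Zynq code would be C on the ARM cores (and Python threads interleave under the GIL rather than truly running in parallel), so this only shows the shape of the split:

```python
# Toy sketch of dividing one job between exactly two workers, the same
# shape as splitting work across the Zynq's two ARM cores. (Python's
# GIL means these threads interleave rather than truly run in parallel;
# the point is the work-partitioning pattern, not the speedup.)
import threading

def checksum(chunk, out, idx):
    """Worker: simple additive checksum over its half of the data."""
    out[idx] = sum(chunk)

def two_worker_checksum(data):
    mid = len(data) // 2
    out = [0, 0]
    # Hand the second half to a worker thread...
    t = threading.Thread(target=checksum, args=(data[mid:], out, 1))
    t.start()
    # ...while this thread processes the first half concurrently.
    checksum(data[:mid], out, 0)
    t.join()
    return out[0] + out[1]

print(two_worker_checksum(list(range(100))))  # 4950, same as sum(range(100))
```

Splitting the data in half up front, rather than sharing one work queue, is the simplest way to keep two fixed cores busy with no locking.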
I do most of the hardware acceleration for our fairly complex algorithms. I’m trying to get back into the software world somewhat just to get a better feel for other stuff that might benefit from hardware acceleration ... I’ve grown tired of beating back software people that seem to want their main program to be an initialization routine while the logic does everything else :-).
I’m going to be building a new Haswell-E system for home/contracting use ... I went the 6 core, 3.5GHz route for that.
Again, I certainly see your point, but I’m also seeing software people get a LOT better using multiple CPUs ... I don’t think it’s a lot of hype anymore (it certainly was not too long ago).
“PPI on the Samsung is far greater. The Samsung has 8 cores versus Apple's 2. I could go on and on.”
Ah... a Samsung fanboi outing himself!
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.