
LG G3 Pre-release/Rumor/Speculation Thread


It marks the end of independent CPU design by Qualcomm, it's going to spark another RAM arms race (instead of reducing bloat and the RAM requirements), it's going to usher in a new round of defects, it's going to destabilize the most stable 32-bit code base in mobile use, it's going to hit just as ART stabilizes (at 32-bit) and destabilize that, and when you get a runaway app your phone is going to turn into a space heater.

64-bit CPUs were an Apple marketing stunt because they couldn't step up to quad cores in any way that made sense for rapid redeployment of apps.

The true preemptive multitasking built into your asymmetrical multiprocessing Linux kernel on your Android insulated you from ever needing new apps when we got multiple CPU cores - totally unlike iOS.

The Dalvik Virtual Machine insulated you from further CPU architectural differences and once perfected for 64-bit use, would have insulated developers completely.

Now everything is going to get model- and compiler-sensitive - which is exactly why Qualcomm surrendered to using ARM Cortex cores: for standardization with others.

But, because Apple has native apps and 64-bit CPUs, welcome to the mobile industry, where it's a good idea to jump off the roof because your friends did too.

And because the blogosphere can't fathom the difference between address space and instruction width, let's all wait for the accolades when the 4 GB barrier for RAM is met and broken, never mind that your RAM power cost at runtime is the same even if you don't use it - only how much you have matters.
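
To make that distinction concrete, here's a rough C sketch - nothing LG- or Snapdragon-specific, just the usual ILP32 vs. LP64 ABI conventions, so treat the exact sizes as an assumption about the platform rather than a statement about any particular phone:

#include <stdio.h>

/* Rough illustration: pointer size (address space) vs. plain data width.
   On a typical 32-bit ABI, void * is 4 bytes, so roughly 4 GB is addressable.
   On a typical 64-bit (LP64 / AArch64 Linux) ABI, void * is 8 bytes.
   A plain int stays 4 bytes either way - ordinary data doesn't get
   "twice as wide" just because the address space did. */
int main(void)
{
    printf("sizeof(void *) = %zu bytes\n", sizeof(void *)); /* 4 vs. 8 */
    printf("sizeof(int)    = %zu bytes\n", sizeof(int));    /* 4 on both, typically */
    printf("sizeof(long)   = %zu bytes\n", sizeof(long));   /* 4 (ILP32) vs. 8 (LP64) */
    return 0;
}

Point being, the win is a bigger address space (plus some ISA housekeeping), not magically wider everything.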

And at the end of that day, people will still be complaining about lag on TouchWiz, about HTC only having 12 mega-Ultrapixels, about LG trying to save costs on design - but hey, we'll all get to marvel at 100,000+ AnTuTu scores.

So, naturally, it's going to be a great improvement.
 
Hmm, I think it'll be a good change rather than a negative one. Even with potential issues coming up, the industry and its engineers are more than prepared to face them and make their products better than ever. I guess the rails only move forward, and so far efficiency keeps climbing. Not too worried about it really, just excited.
 

I'm sure my hardware engineering siblings in the semiconductor industry and my software engineering siblings developing kernels, apps and compilers are every bit as excited as you are. :)

Battery manufacturers are going to be beside themselves with glee.
 

Come on, EM: more jobs, more money, more development, higher prices... :)

It all sticks together rather nicely.
 
Of course there will be shortcomings, but they are necessary for further development, lol :). What I'm really interested in is the future of cross integration, with the phone as a cross-platform base for a home computer or personal desktop. Just like the Edge project from the Linux world (if I'm not mistaken).

These chips are getting incredibly powerful, catching up with regular desktop CPU/GPU chips. Just like the Tegra K1...
 
Rush special: :D

G3 2GB vs G3 3GB

https://www.youtube.com/watch?v=CTxehFharrI&feature=youtu.be

Performance difference is about 5%.

I am convinced I'd be happy with the 2 GB/16 GB version. But Legere will not let me buy one.

Was it just me, or did the 3GB model seem a little brighter?
Aside from the possibility that the lighter-colored screen frame may have affected my perception, it looked brighter to me.
I would assume the settings were identical on both phones, to rule out any differences other than the RAM.
 

I've noticed this too, but I think it's the lighting conditions in his test room.
 

Prolly so, as most of us aren't professional videographers, including yours truly. :)

What I know for certain on such matters would take up less space than this sentence, :D but I didn't link the extra RAM to a brighter screen.
Just found it odd to release a video with benchmarks/comparisons of otherwise identical phones, have such a noticeable difference in brightness, and leave some people guessing that the possibility exists.
 
Don't forget: those automated tests are designed to squeeze every last bit of performance from the RAM and CPU.

A 5% difference in those tests essentially means that the user will ALMOST never see a difference in real-life situations.
 

Amateurs :D

Yet I am glad he did post this video.
 

The differences in the video were negligible at best.
Nowadays, you'd almost have to go out of your way to find a phone that actually performs less than acceptably.

A co-worker of mine just got one of these last week:

LG Optimus L90 Smartphone | LG Optimus L90 Reviews & Specs | T-Mobile

I have to admit, while I was helping him set everything up, I was a bit envious of the bang-for-the-buck vs the phones I own. ;)
 

Listen, I paid $79.99+tax for my F6. It does ALMOST all I need.

But I still want a G3. I deserve it, no? :D
 
Was it just me, or did the 3GB model seem a little brighter?
Aside from the possibility that the lighter-colored screen frame may have affected my perception, it looked brighter to me.
I would assume the settings were identical on both phones, to rule out any differences other than the RAM.

Yeah, the one on the left was a little brighter. I also found these T-Mobile reviews of the 3 GB/32 GB model. All outstanding - of course, they're trying to sell you a phone.

LG G3 Smartphone Phablet | LG G3 Reviews & Tech Specs | T-Mobile
 
Ars Technica is another of my favorite review sites. Their review of the G3 sums up my concerns about throttling due to load and ambient temperature. Again, I need an actual device to test for myself first, but all indications suggest this device is not for people leaning towards gaming on a phone. In that respect, my 32GB S4 or the Note 3 may already be better options.

Their main theme echoes some of our own concerns: the extra pixels made LG compromise the display's overall image quality (brightness & contrast) and performance. Battery life also appears to take a greater hit with actual heavy use. There appears to be a lot of dynamic display adjustment going on. I turn that stuff off on my other devices. Too distracting for me, since I notice it.

If you are a heavy user of web and games, this might not be the best option. Everybody else should be really happy :)

Looking forward to testing to see if my experience reflects Ars and the few other reviews that actually used the device beyond token use.

LG G3 review: A great phone with way too many pixels | Ars Technica
 
post of the year. period.
It might be like the difference between an atomic bomb and a hydrogen bomb. Either one is plenty bright.......and why are you up at 3:24 am? ( Posting Police )


And a Happy Father's Day to all.

More:

Nits of brightness (close measurements, but not exact):

iPhone 5S - 562
M8 - 490
Galaxy S5 - 442
LG G2 - 450
LG G3 - 376

Really not sure how to interpret this. My G2 seems fine in sunlight. I hope the G3 has enough, although I don't spend very much time using my phone in direct sunlight. Who does?

Also, there is this (from the Straits Times):

Our test loops a 720p video on full brightness, but the G3 automatically dims the screen in video playback, so the almost nine hours of battery life is not reflective of actual use. Using the VLC player, which did not dim the brightness, the battery clocked less than four hours, which is below average battery performance. - See more at: http://www.straitstimes.com/digital...hone-the-market-20140612#sthash.TKlxULkc.dpuf

Not sure if there is a separate video playback brightness setting using the G3, as there is in some other phones.
Happy Father's Day to you and everyone! I drove back in the middle of the night from seeing my father. That's why I was up.
If anyone thinks a lot about the nits of brightness ( Steven excluded ) would he then be called a Nitwit?
The older I get, the more nitwitty I feel. :)
That's a de-lux question.

See what I did there?

Hopefully, final US versions will be brighter.

Nnnope, dream on. When have our hopes ever been realized... phone-wise? Hmmm??????
 

Hate to admit it... but I'm strongly considering just getting an M8 for $99.

I can compromise on many things with phones... battery life during real-world usage isn't one of them.
 