Friday, January 27, 2017

Watching daily Jan 27 2017

>> Yeah.

>> DREW: How you doing?

How are you?

>> I'm good.

Thank you.

>> DREW: We saved something really nice for you behind this, George.

Go ahead, George.

>> GEORGE: A couple of great prizes: we've got a new hot tub

and a pair of patio heaters!

( cheers and applause ) Experience true comfort in this

luxurious Beachcomber hot tub, which seats five adults and

features 45 customizable jets.

Five deluxe cotton towels included.

Plus, you can keep extra warm under these propane quartz

glass tube patio heaters, which have wheels for easy mobility!

Take it away, Drew!

( cheers and applause ) >> DREW: Thanks, Rachel.

Lipine, this game is called Side By Side.

Right now the numbers are stacked up, but you have to

make them so they're side by side, so it's $8,163 or $6,381

for all those prizes.

$8,163.

Is she right?

Yes!

Well, there you go.

For more information >> The Price Is Right - One Side Wins - Duration: 1:19.

-------------------------------------------

Suzumiya Haruhi no Tsuisou - 272: Niku Udon ¥150 x 3 (Part 114) - Duration: 1:52.

Haruhi: How's it going?

At face value this seems like an innocent question, but if you look behind it what she really means is "You don't have any excuses now", right? Sigh, how should I even answer that?

Haruhi: Not that it matters. Your tears will tell the tale later, right?

Kyon: Hey, question? You've been rooting for Koizumi this whole time, right?

Haruhi: I haven't taken sides at all. In fact, if someone asked me to place a bet, I'd put my money on you.

Kyon: Betting on the dark horse?

Haruhi: Yeah, but that's not all. I'm expecting Koizumi-kun will win by a landslide.

Haruhi: But if I bet on you, even if my hunch is wrong, I'd be rolling in so much dough that I wouldn't feel bad about it.

Ladies and gentlemen, I present Haruhi Suzumiya. I can't even be arsed to give a snappy comeback.

Haruhi: How about you, Koizumi-kun?

Koizumi: Right. I guess I'm doing so-so.

Haruhi: Hmm, going smoothly then.

Kyon: Hold it, why are you taking his answer at face value?

Haruhi: Because Koizumi-kun doesn't need lies to boost his reputation.

Ladies and gentlemen, our mighty Brigade Chief. I can't even muster the effort to get angry at this point.

However, going by Koizumi's usual never-ending smile and composed mentality, he can probably back it up. I should just let this be over and done with.

As I finished drinking the soup from the niku udon Koizumi treated us to, Haruhi loudly declared the start of round 2.

Haruhi: Alright... Let's get started on the second half. We'll meet back here at, let's say around 4:00. Dismissed, thanks for the meal, Koizumi-kun!

For more information >> Suzumiya Haruhi no Tsuisou - 272: Niku Udon ¥150 x 3 (Part 114) - Duration: 1:52.

-------------------------------------------

Healing Walk - Duration: 1:24.

Stop the destruction. Start the healing. That is what this is all about.

We are here to heal the land.

We're here to heal the water.

We're here to do what we got to do to look out for our land.

What we're doing here is no small thing.

Each of us represents so many people that wanted to be here.

So many people that know inside they should be here.

It's enormous.

If one falls!

We all fall!

If one falls!

We all fall!

If one falls!

We all fall!

What's your occupation!

For more information >> Healing Walk - Duration: 1:24.

-------------------------------------------

Deep Learning with Tensorflow - Recursive Neural Tensor Networks - Duration: 5:46.

Hello, and welcome!

In this video, we'll provide an overview of Recursive Neural Tensor Networks, as well

as the natural language processing problems that they're able to solve.

Sentiment Analysis is the task of identifying and extracting subjective information, like

emotion or opinion, from a source material.

For example, this might involve analyzing a Twitter feed to determine which tweets express

a positive feeling, which express a negative feeling, and which are neutral.

In order to classify sentences into different sentiment classes, we'll need a dataset to

use for training.

One potential dataset is the Stanford Sentiment Treebank.

Each data point is the syntax tree of a Rotten Tomatoes review.

The tree itself and all the subtrees are labeled with a sentiment value from 1 to 25, where 25 is the best possible review and 1 is the worst.

The dataset was created by Stanford researchers, who utilized Amazon's Mechanical Turk platform

in order to assign values.

Recursive neural models can be used for the sentiment analysis problem.

These types of models are characterized by their use of vector representations.

Vectors are used to represent words, as well as all sub-sentences related to an input's

syntax tree.

The word representations are trained with the model, and the representations of sub-sentences

are calculated with a compositionality function.

To calculate the sub-sentences' representations, we apply the compositionality function bottom-up

according to the input's parse tree.

All vectors are fed to the same softmax classifier to determine the sentiment.

The choice of compositionality function is important, so we'll present three different

types of recursive models, each with a different function.

The first model we'll look at is the basic Recursive Neural Network.

To compute our word composition, we start with the vectors that we want to combine, which we'll call "b" and "c".

We form a single "two d"-dimensional column vector by concatenating "b" and "c".

This vector is multiplied by the "d" by "two d" weight matrix "W".

"W" is the model's main training parameter.

Then a nonlinearity is applied element-wise to the resulting vector.

In this case, the nonlinearity is the hyperbolic tangent function.

As a brief note, we've omitted the bias for simplicity.
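
To make this concrete, here is a minimal NumPy sketch of the basic Recursive Neural Network composition, applied bottom-up over a toy parse tree with the shared softmax classifier at the root. The sizes, initialization, and names ("d", "W", "W_s", "compose", "evaluate") are illustrative assumptions, not code from the video.

import numpy as np

d = 4                                          # toy word-vector dimensionality
rng = np.random.default_rng(0)
W   = 0.1 * rng.standard_normal((d, 2 * d))    # composition weights, d x 2d
W_s = 0.1 * rng.standard_normal((5, d))        # shared softmax weights, 5 classes

def compose(b, c):
    # p = tanh(W [b; c]); the bias is omitted, as in the video
    return np.tanh(W @ np.concatenate([b, c]))

def sentiment(p):
    # the same softmax classifier is applied to every node vector
    z = W_s @ p
    e = np.exp(z - z.max())
    return e / e.sum()

# toy parse tree ("not", ("very", "good")) with random word vectors;
# in a real model the word representations are trained parameters
words = {w: 0.1 * rng.standard_normal(d) for w in ("not", "very", "good")}

def evaluate(tree):
    # apply the compositionality function bottom-up along the parse tree
    if isinstance(tree, str):
        return words[tree]
    left, right = tree
    return compose(evaluate(left), evaluate(right))

print(sentiment(evaluate(("not", ("very", "good")))))   # class probabilities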

Other models use this compositionality function, like the recursive autoencoder, and recursive

auto-associative memories.

As you can see, the words only interact implicitly through the nonlinearity, so the compositionality

function may not be consistent with linguistic principles.

The model also ignores reconstruction loss, since the dataset is large enough to compensate.

Now let's move on to Matrix-Vector Recursive Neural Networks.

This type of model is a linguistically-motivated improvement over the basic recursive neural

network.

The big change is that now every word is represented by both a vector and a "d" by "d" matrix.

The compositionality function that you see here takes four objects.

Lowercase "b" and "c" are the word vectors, while the uppercase "b" and "c" are the respective matrices.

Lowercase "p1" is the resulting vector, while uppercase "P1" is the respective matrix.

Just like with basic recursive neural networks, a matrix "W" is multiplied with a matrix created

from the words' representations.

But in this case, the matrix created is much more dependent on the relationship between

the two input words.

The problem with this model is that the number of trainable parameters becomes too large

as the vocabulary size increases.
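
As a rough illustration, here is a sketch of the matrix-vector composition under the same toy assumptions; the parameter names "W" and "W_M" follow the common MV-RNN formulation and are not taken from the video.

import numpy as np

d = 4
rng = np.random.default_rng(1)
W   = 0.1 * rng.standard_normal((d, 2 * d))    # combines the transformed vectors
W_M = 0.1 * rng.standard_normal((d, 2 * d))    # combines the two word matrices

def compose_mv(b, B, c, C):
    # each word's matrix first transforms its neighbor's vector (C b and B c),
    # so the two words interact explicitly, not only through the nonlinearity
    p1 = np.tanh(W @ np.concatenate([C @ b, B @ c]))   # resulting vector, size d
    P1 = W_M @ np.vstack([B, C])                       # resulting matrix, d x d
    return p1, P1

b, c = rng.standard_normal(d), rng.standard_normal(d)
B, C = np.eye(d), np.eye(d)    # word matrices are often initialized near identity
p1, P1 = compose_mv(b, B, c, C)
print(p1.shape, P1.shape)      # (4,) (4, 4)

Each vocabulary word stores a vector plus a "d" by "d" matrix, roughly d plus d-squared parameters per word, which is where that growth comes from.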

The Recursive Neural Tensor Network, or RNTN, uses a powerful fixed-size compositionality

function that only takes the word's vectors as arguments.

The model is not parameterized by matrices alone: it adds a "two d" by "two d" by "d" tensor that is used in the function.

This tensor is also trained with the model.

Each of the "d"

slices captures a different type of composition, so intuitively, it is more capable of learning

than the basic recursive neural network.
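
Here is a matching sketch of the RNTN composition under the same toy assumptions; the tensor term adds a bilinear interaction between the concatenated word vectors on top of the standard "W" term, and the names and sizes are illustrative.

import numpy as np

d = 4
rng = np.random.default_rng(2)
W = 0.1 * rng.standard_normal((d, 2 * d))          # standard term, d x 2d
V = 0.1 * rng.standard_normal((2 * d, 2 * d, d))   # tensor term, 2d x 2d x d

def compose_rntn(b, c):
    # p = tanh(x^T V x + W x) with x = [b; c]; each of the d tensor slices
    # V[:, :, k] contributes one coordinate of the bilinear term
    x = np.concatenate([b, c])
    bilinear = np.einsum("i,ijk,j->k", x, V, x)
    return np.tanh(bilinear + W @ x)

b, c = rng.standard_normal(d), rng.standard_normal(d)
print(compose_rntn(b, c))                          # composed d-dimensional vector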

It turns out that RNTNs outperform the known alternative methods.

They have achieved over eighty-seven percent accuracy in positive/negative word classification,

and over eighty-five percent accuracy in positive/negative sentence classification on the Stanford

Sentiment Treebank.

This sentence classification accuracy is more than three percent higher than that of normal recurrent networks.

Recursive Neural Tensor Networks can also be used in other applications, such as parsing natural scenes and parsing natural language.

This is due to the recursive nature of these problems.

If you're interested in learning more about RNTNs, we recommend you follow the link here

to a great article by Socher and others.

By now, you should understand the intuition behind recursive neural models, and recursive

neural tensor networks.

Thank you for watching this video.

For more information >> Deep Learning with Tensorflow - Recursive Neural Tensor Networks - Duration: 5:46.

-------------------------------------------

India vs England 1st T20 Highlights - 26 Jan 2017 - Watch Cricket Highlights - Duration: 14:18.

1st T20

For more information >> India vs England 1st T20 Highlights - 26 Jan 2017 - Watch Cricket Highlights - Duration: 14:18.

-------------------------------------------

Don't freak out but, something mysterious is killing 11,000 nearby galaxies - Duration: 4:35.

Don't freak out, but something mysterious is killing 11,000 nearby galaxies.

According to a recently published study by a global team of researchers, something is killing off around 11,000 nearby galaxies. Researchers observed the galaxies and noticed something that should not be happening: a mysterious phenomenon is violently stripping away their gas, their lifeblood for the formation of new stars, on a widespread scale.

While researchers are still unsure why this is happening, and why at such a large scale, they believe it has something to do with the halos of dark matter that are believed to surround galaxies, which are responsible for removing the star-forming gas in a fast-acting process referred to as ram-pressure stripping.

The study, published in the peer-reviewed journal Monthly Notices of the Royal Astronomical Society, clearly illustrates that this phenomenon is more prevalent than previously thought.

The process drives gas out of thousands of galaxies, causing an early death by stealing the material they need to create new stars.

According to Toby Brown, a Ph.D. candidate at ICRAR and Swinburne University of Technology:

"During their lifetimes, galaxies can inhabit [dark matter] halos of different sizes, ranging from masses typical of our own Milky Way to halos thousands of times more massive. As galaxies fall through these larger halos, the superheated intergalactic plasma between them removes their gas in a fast-acting process called ram-pressure stripping. You can think of it like a giant cosmic broom that comes through and physically sweeps the gas from the galaxies."

Simply put, removing the gas from galaxies leaves them unable to form new stars, said Brown:

"It dictates the life of the galaxy because the existing stars will cool off and grow old. If you remove the fuel for star formation then you effectively kill the galaxy and turn it into a dead object."

Another process that also causes galaxies to die, but on a much slower timescale, is known as strangulation. Brown explained:

"Strangulation occurs when the gas is consumed to make stars faster than it's being replenished, so the galaxy starves to death. It's a slow-acting process. On the contrary, what ram-pressure stripping does is bop the galaxy on the head and remove its gas very quickly, on the order of tens of millions of years, and astronomically speaking that's very fast."

Co-author of the study, ICRAR researcher Barbara Catinella, said that astronomers were already aware that the process known as ram-pressure stripping was responsible for the death of galaxies in great galaxy clusters, which experts think are surrounded by the most massive "dark matter halos" in the known universe.

In order to observe 11,000 galaxies, astronomers combined the largest optical galaxy survey yet completed, the Sloan Digital Sky Survey, with the largest set of radio observations for atomic gas in galaxies, the Arecibo Legacy Fast ALFA survey.

Brown concluded:

"This paper demonstrates that the same process is operating in much smaller groups of just a few galaxies together with much less dark matter. Most galaxies in the universe live in these groups of between two and a hundred galaxies."

"We've found this removal of gas by stripping is potentially the dominant way galaxies are quenched by their surrounds, meaning their gas is removed and star formation shuts down."

For more information >> Don't freak out but, something mysterious is killing 11,000 nearby galaxies - Duration: 4:35.

-------------------------------------------

Future life with technology is good and beautiful. This is why I love the future. - Duration: 2:36.

For more information >> Future life with technology is good and beautiful. This is why I love the future. - Duration: 2:36.

-------------------------------------------

Is There a Difference Between a Car Crash and Car Accident? - Duration: 1:41.

What I mean when I say there are no accidents is that there's an explanation for why everything happens.

And I think people use the term accident to mean someone didn't mean to do something.

So, to the extent that "it was an accident" means "I didn't mean to," that's true. But when you look behind why something happened, it's not an accident.

It's something where you could have done something different.

You could have been more safe.

You could have followed the rules.

You could have not been in a hurry.

You could have done things that would have prevented something from happening.

So that's why I say it's a collision, it's not an accident.

It's, you know, it's rare that we've proved that somebody intentionally did something

to someone.

And yes, it's an accident, because you didn't mean to.

And God, you'd take it back if you could.

So, that's why I say there are no accidents. Our job, when we help people who have been injured or whose family member was killed, is to help them on the front end.

Figure out where they go from here and help them as far as finding whatever resources

there are in the community.

Or their insurance.

Or their health insurance, their disability insurance.

Their life insurance, whatever, help them move forward while we're investigating to

find out why did this happen.

At the McArthur Law Firm, our job is to fight for you to make sure you get justice and reasonable

compensation for your injuries.

To get in touch with us call 1(888) WE-FIGHT or go to our website McArthurLawFirm.com.

For more information >> Is There a Difference Between a Car Crash and Car Accident? - Duration: 1:41.

-------------------------------------------

A loved one has just passed away... - Duration: 1:51.

The best way for us, who lose someone through death

is to send our love.

And the way we send our love,

for many of us,

is to pray.

I often like the idea of thinking of that person who has departed as someone who has traveled far from you.

So, let's pretend your child

upon, oh, let's say,

graduating from high school and before

going to college, your child may be 18, 19,

has a possibility of great adventure,

going far, far away,

maybe backpacking in Europe, on a train.

And you still have a relationship with your child.

You can call your child.

Right? You can send an email or text.

You can stay connected, even though

you are not physically close to him or her.

And you actually want your child to have a great experience.

So the best way to

really cultivate that relationship, for those

of us who have family members and dear friends

who are deceased, who have departed,

is to think that you're actually writing a letter,

writing an email,

sending a text, talking on the phone.

That person will receive

your vibrations of love,

your thoughts, your ideas, and your love, as well.

So, it's a great way to understand that that relationship continues on.
