Wednesday, 28 March 2018

Watching daily Mar 29 2018

Best songs for Playing Fortnite Battle Royale #126 | 1H Gaming Music Mix | Fortnite Music NCS 1 HOUR

For more information >> Best songs for Playing Fortnite Battle Royale #126 | 1H Gaming Music Mix | Fortnite Music NCS 1 HOUR - Duration: 1:03:38.

-------------------------------------------

Preparing for your Filipina lady arriving in Australia - Duration: 7:30.

Preparing for your Filipina lady arriving in Australia

The grand day has arrived!

Your Filipina lady has her Australian visa grant!

Could be a Partner Visa or a Prospective Marriage Visa,

or maybe it's the first time she's visited on a

Tourist Visa.

We're talking about the moment when your lady from the Philippines first

steps across your threshold and enters your home in Australia.

That means it will be her home too, either for a first three month visit

or for many years ahead.

Whichever way it is, something monumental is about to happen and

your life will never be quite the same again.

Living together as an Australian Filipina couple

Quite a number of our Australian sponsors haven't lived with a lady before.

Many "never married" chaps out there.

And some of you have been divorced and living as bachelors for

enough years to forget that life was ever any different.

And.....well....life with a Filipina is usually a little different to life with local

ladies.

Why?

Mainly because Filipinas are a little bit more domesticated and take their wife role

a little more seriously than many a modern Aussie woman

does.

That's what I think.

Not just from the housework perspective, but the whole wife-and-mother

role and the set of duties and expectations that go with it.

Don't expect her to arrive and plonk herself on the couch to

watch things happen around her.

She will have very clear ideas about how things should be,

and she will generally get stuck into things once the shyness wears off.

Your Filipina lady and housework

In my experience, we spent two days cleaning our house top to bottom.

Had the bonfire going for two days too.

Mila arrived and spent a week redoing what we thought we did a

fantastic job on.

Apparently you DO need to clean under beds.

Didn't know that.

Look, if your lady is anything like Mila (and I hope she is, as you will be a lucky man like I am), she will do this and she won't mind.

Be prepared to let her do it.

Don't expect to sit her down and wait on her, or you will embarrass

her as much as you would be embarrassed to sit on your bum while she changed the car

tyre!

Get what I mean?

Let her take care of you and your house.

Get used to it!

Maybe soon you'll LIKE living in a clean house with healthy and delicious food, and looking less like a stray!

What you need to expect is some re-arranging and some serious "observation" taking place!

Previous Relationships and your Filipina lady

I had some old letters in a drawer, and in a box somewhere.

Yes, of course Mila found them.

Fortunately they all had dates on them prior to us meeting, so I'm still alive today.

Some ladies have been less understanding, and have

assumed the letters remained because you were still crazy about this woman and kept

the letters on purpose!

Photos, even worse!

If you HAVE kept any memorabilia lying around, get rid of it!

If the CSI team from TV couldn't find it, she will!

I remember years ago a silly chap keeping a pair of panties from the

ex.

She found them.

The results were very bad.

And rightly so!

And there's one more topic which extends from this.

Porn!

Filipina attitudes toward pornography and husbands

I touched on this topic in a recent FilipinaWives post on "Mind Reading", some will

remember.

I have no personal stories, happy to say.

But there's many a chap out there who's developed a habit over the years, and

some may still have a magazine collection tucked away somewhere.

Strongest suggestion?

Burn it!

Or as I said to one client (much to his amusement), "Sell the porn collection

and buy a rice cooker".

Fact is, the majority of Filipina ladies will be horrified to see their man drooling over

other women.

It's insulting, and seen as akin to cheating.

It can lead to horrible fights, and a few times I've seen it come close to breaking a couple up.

You should be perfectly content with this wonderful lady, otherwise why is she even

there in the first place?

Rice and other food for Filipinas

I've done a few articles on what I think of the average Filipino diet on

www.filipinawives.com.au as many of you know, but right now that's not the main issue.

What IS an issue is you need to consider that she's just arrived and will need time to adjust,

and that change takes time and takes consensus.

Right now you should ensure that she's comfortable, and yes a big part of this is

rice!

So make sure you purchase a RICE COOKER if you don't have one.

Any appliance store will sell you one.

Get something good quality.

And make sure you have things like:

* Soy

* Vinegar

* Garlic

* Onions

* Pepper

* Bay leaves

* Fish sauce

* Chicken stock

And of course, a large bag of rice.

That means at least 10kg, preferably 20kg!

500 gram bags of rice don't exist in the Philippines.

Chicken, pork and fish.....a good idea.

Don't expect her to take to lamb too quickly.

And a few cans of sardines and (yes, I'm serious)

Spam will go down well.

Hot dogs too.

Find an Asian store, because she will want to go shopping obviously.

If they have a walis tingting there (a "witch's broom"), grab it.

One or two plastic Tabo (plastic dipper for bum-washing) too.

Next article, I'll explain a bit about friends and social groups, as well as relating to

kids and family.

Be sure to show plenty of patience, OK?

Mila still talks about the efforts I made all

those years ago, so yes it matters.

For more information >> Preparing for your Filipina lady arriving in Australia - Duration: 7:30.

-------------------------------------------

Kaden Ford headed to Augusta National for Drive, Chip and Putt National Championship - Duration: 2:23.

For more information >> Kaden Ford headed to Augusta National for Drive, Chip and Putt National Championship - Duration: 2:23.

-------------------------------------------

Rearchitecting for the cloud with Robert Venable - Duration: 13:23.

This is Robert Venable.

He likes whitewater rafting, shrimp poboys from the

Old Tyme Grocery in Lafayette, Louisana,

and hanging out with his sons.

He works as a principal architect for Microsoft IT, leading the effort to rebuild financial reporting systems in the cloud.

The thing I love about my job is

solving real world problems.

Specifically, I get to solve them within finance, which is historically a risk-averse field for companies. I get to provide new capabilities and advancements.

And one of those financial systems is our

revenue reporting system,

commonly referred to internally as MS Sales.

MS Sales is a large data warehouse and analytical platform built for Microsoft's revenue reporting that is based on Microsoft SQL Server technology. We're rearchitecting it to be in Azure and use some of Azure's capabilities to make our users happy.

We have to keep 21 years of sales data.

10 years forward-looking, 10 years historical,

and the current year.

We need to do this for compliance reasons.

The people that look at the data want to see what revenue looked like based upon the business rules and the organizational structure in the past, as well as what it would look like in the future given the changes within a business rule.

Overall, MS Sales is a pretty big system.

MS Sales was originally built on SQL Server

and mainly in a scale up fashion.

So, MS Sales is about 20 years old,

and over those 20 years it had been

enhanced and added on to, which made it more complex.

So some of the code was spaghetti-ish in nature.

We get about 1,500 sources of data through MS Sales.

This includes our channel partners.

We also have multiple billing systems within Microsoft as well as licensing and product systems.

The system actually integrates all this data together

to give you a view of Microsoft's financial revenue

position across organizations, business segments,

geographic hierarchies, those kinds of things.

When the MS Sales app was built 20 years ago,

customers typically purchased a box of new software from

Microsoft every three to six years.

Over time, the number of incoming transactions has multiplied exponentially.

The app now operates 'round the clock,

tracking billions of financial transactions.

These could be large purchases, like when a global company

subscribes its workforce to Office 365,

or micro-transactions like when a customer makes

a short call on Skype, or uses a few minutes of

server time on Microsoft Azure.

The MS Sales app has struggled to keep up with the

heavy demands of modern financial reporting.

The uh-oh moment was when one of the development leads

came to me a couple of years ago.

He said, "I think we have a problem with MS Sales

"as it currently is architected.

"The data size and the growth that we see

"based upon the hardware that was available

"doesn't look like it's gonna keep up."

We did a graph of how fast the data was growing,

and we have an exponential data curve,

but we had more of a linear compute curve

from basically the scale out and Moore's Law.

We found out that hey, in 18 months we're gonna

tip over if we don't do anything.

We're not gonna meet our business needs.
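The tip-over estimate the team describes, an exponential data curve crossing a linear compute curve, can be sketched with a few lines of Python. Every constant below (growth rate, starting sizes, capacity added per month) is illustrative, not the team's actual figures; only the shape of the two curves matters.

```python
# Illustrative projection: exponential data growth vs. linear compute growth.
# All constants are hypothetical; only the curve shapes matter.
data = 100.0            # current data volume, arbitrary units
compute = 400.0         # current compute capacity, same units
data_growth = 1.08      # assumed 8% month-over-month data growth
compute_added = 12.0    # linear capacity added per month

months_to_tipover = 0
while data <= compute:
    months_to_tipover += 1
    data *= data_growth          # exponential curve
    compute += compute_added     # linear curve
# After the loop, demand first exceeds capacity at month `months_to_tipover`.
```

With these made-up numbers the curves cross after roughly two years, which is exactly the kind of runway estimate that kicked off the re-architecture effort.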

That kicked off an effort to find out

what technology stack we could use in the future

to help MS Sales fulfill its needs.

We chose a distributed system, so instead of scaling up

we thought about how can we scale out?

We're going towards more of a modern data warehouse

or a logical warehouse where we try not to hop the data.

We actually try to bring the compute to the data

as much as we bring the data to the compute.

As the clock ticked towards

the demise of MS Sales, the team considered

the best options.

Would they lift and shift the entire system to the cloud

employing infrastructure as a service?

Or would they build something from scratch

using platform as a service fabric

and big data computing solutions?

Whatever solution they picked would have to support

hefty future performance and capability needs.

The solution would also need to take into account

the cost and complexity of re-engineering the app

in a race against time.

The team finally decided on using Apache Spark for HDInsight, which allowed for reuse of existing code

but also provided a robust architecture

that could easily scale out.

Spark is a big data processing engine.

It has a couple of different advantages.

One advantage that we like is the in-memory processing

and the other is that I can basically

use the same code and use it for streaming

or use it for batch.
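The same-code-for-batch-and-streaming point can be illustrated with a plain-Python sketch (not the team's Scala, and with a made-up `enrich` step): write the transformation once, then feed it either a finite batch or an unbounded stream.

```python
from typing import Iterable, Iterator

def enrich(txn: dict) -> dict:
    """Hypothetical business logic, written once."""
    out = dict(txn)
    out["revenue_usd"] = out["amount"] * out["fx_rate"]
    return out

def process(txns: Iterable[dict]) -> Iterator[dict]:
    """Works unchanged whether `txns` is a finite batch (a list)
    or a stream (a generator that keeps yielding)."""
    for t in txns:
        yield enrich(t)

# Batch mode: a finite list of records.
batch = [{"amount": 10.0, "fx_rate": 1.5}, {"amount": 5.0, "fx_rate": 2.0}]
batch_results = list(process(batch))

# Streaming mode: the same `process` consuming a generator.
def incoming():
    yield {"amount": 2.0, "fx_rate": 1.0}

stream_results = list(process(incoming()))
```

The design payoff is the one the speaker names: the transformation logic is a single code path, so switching a pipeline from batch to streaming does not mean rewriting it.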

So I'm a firm believer on keeping your options open,

especially when you start down a path

and you don't know exactly where you're gonna end up,

you try to keep as many of the options open

in your back pocket as you can.

As the MS Sales app continued

chugging towards a cliff,

the team seemed to have its solution.

They would use a distributed system based in

Microsoft Azure, which remedied all of the app's current shortcomings and added robust cloud capabilities.

Though everyone agreed the solution was

best for the situation, implementing it would require

IT experts to move well outside their comfort zones.

When we moved to open source, there were a couple of different cultural changes that we needed to embrace.

One was we had a development team that had been

working on MS Sales for a long time.

So what they knew was SQL, they knew it inside and out,

and we needed to move to an open source technology,

and that new technology landscape was scary for them.

It's just a different way of thinking about the processing, and trying to do that is a cultural change that we had to make within our own engineering team.

Being that it's open source was just another thing

that was scary because most of them had some C# capabilities

and moving to where we actually ended up, which is Scala,

was daunting to them, it was a cultural change.

From a business aspect, even they knew SQL. SQL Server being a Microsoft product, they were able to open up SQL Server Management Studio, write a T-SQL statement, and actually view the data.

They were comfortable in what they knew.

Rebuilding an app this big and

important to Microsoft required significant buy-in

from teams across the company.

The team worked hard to earn the trust of key stakeholders.

We can't go dark for 12 months or 16 months

and then say, "Oh by the way, we're here

"two months before we fall over,

"and here's your new system."

So there's a lot of confidence building and trust building you have to do with both the business side and the engineering side.

With MS Sales there were two ways to really do this.

The first way is we took vertical slices of the platform

and tried to move them into the cloud

and to use a different paradigm.

The problem with that was that if I just moved

ingestion or I just moved processing or

just moved distribution, I had really no end-to-end value

and I didn't get to start the cultural change

from a business aspect of what does it mean

when my data is refreshed every 30 minutes?

We decided instead of taking a vertical approach

we tried to take a horizontal approach.

So we took a specific pipeline within MS Sales,

we call it the channel pipeline.

We actually took two, channel and inventory.

But we took that holistically, and so it's a little bit of

ingestion, a little bit of processing,

a little bit of distribution, and we moved that piece

as our pilot phase.

The current model in which Microsoft operated was more batch ingestion, and we would get a file once a day or three times a day.

But we would basically batch data through the system.

The real thought process there is

how do we get out of that batch, latent, inherent system

and think about hey, when a transaction hits

an event hub for example, I can process that transaction

from beginning to the end without ever even

landing the data if I want to.

To support current internal systems and partners,

ingestion must allow for batch and streaming methods.

Incoming file transfers land in blob store

and a simple process built in service fabric

validates basic elements of the file.

Number of columns, schema, data types, and more.

The process then streams each row as a transaction

into the event hub.

A copy of the validated data is saved for archival

which allows for auditing in each stage of the process.

The future state will utilize an API for partners

to stream transactions real-time into the event hub

and provide faster ingestion and processing.
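A minimal sketch of that validate-archive-stream step can look like the following. The column names and the `publish` callback are invented stand-ins; the real system uses Service Fabric for validation, blob storage for the archival copy, and Event Hubs for the per-row events.

```python
import csv
import io

# Hypothetical contract for an incoming file; the real schema is not public.
EXPECTED_COLUMNS = ["transaction_id", "amount", "country"]

def validate_and_stream(file_text: str, publish, archive: list) -> int:
    """Validate basic elements of the file (columns, types), keep an
    archival copy for auditing, then emit each row as one event."""
    reader = csv.DictReader(io.StringIO(file_text))
    if reader.fieldnames != EXPECTED_COLUMNS:
        raise ValueError(f"schema mismatch: {reader.fieldnames}")
    archive.append(file_text)                 # stand-in for blob-store archival
    sent = 0
    for row in reader:
        row["amount"] = float(row["amount"])  # basic type validation
        publish(row)                          # stand-in for an Event Hub send
        sent += 1
    return sent

published, archived = [], []
sample = "transaction_id,amount,country\n1,10.5,AU\n2,3.25,PH\n"
n = validate_and_stream(sample, published.append, archived)
```

Keeping the archival copy before any row is emitted is what makes each stage of the pipeline auditable, as the transcript notes.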

It was really about how do we use a lot of

distributing computing versus scale up computing?

It was about how do I make sure that the system

can meet the demands of today as well as

meet the demands of tomorrow?

Historically, we have been running 21 years of data

and our end users would see data every 24 hours.

In the new processing we have reduced that

to be able to process 21 years of data in 42 minutes,

so the end users can actually see fresh data

every 42 minutes.

To test the scale of that, we have increased the data

to 10 times that volume, and our processing time

only increased by 10 minutes.

So even at 10 times the volume of today's 21 years of data,

the end user can see data in 52 minutes.

Where we're going with this is

when a change happens on a business rule,

an event gets fired.

That event is taken, and then an analysis of which transactions are affected by that event is done.

Then only those transactions actually get

fed through and are incrementally updated.

Currently we're using Drools basically as an add-on into the Spark processing pipeline.
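The incremental idea, reprocessing only what a rule change touches, can be sketched like this. The field names and the rule shape are invented for illustration; the real pipeline drives this logic with Drools inside Spark.

```python
def affected(transactions, rule):
    """Analysis step: which transactions does this rule change touch?"""
    return [t for t in transactions if t["segment"] == rule["segment"]]

def apply_rule(txn, rule):
    updated = dict(txn)
    updated["revenue"] = txn["amount"] * rule["recognition_pct"]
    return updated

def on_rule_change(transactions, rule):
    """Incrementally update only the affected subset; everything else
    passes through untouched instead of being recomputed."""
    hit = {t["id"] for t in affected(transactions, rule)}
    return [apply_rule(t, rule) if t["id"] in hit else t for t in transactions]

ledger = [
    {"id": 1, "segment": "cloud",  "amount": 100.0, "revenue": 100.0},
    {"id": 2, "segment": "retail", "amount": 40.0,  "revenue": 40.0},
]
rule_event = {"segment": "cloud", "recognition_pct": 0.5}
updated = on_rule_change(ledger, rule_event)
```

Against 21 years of transactions, feeding only the affected subset back through the pipeline is what makes a business-rule change cheap instead of a full reprocess.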

For our distribution side, currently today we offer

data marts that people can pivot and see data

the way that they need to see it

to actually figure out what they're trying to solve

or to make decisions for their businesses

or for their specific application.

The team is currently using SQL Database

for distribution, primarily to maintain

backwards compatibility with the client app

used to access the platform.

The transfer time for distribution has become

the bottleneck in end-to-end processing

and the team is implementing Azure Data Warehouse

in combination with changes to the client app

to create a distributed data model

that mimics the design of processing.

It's hard to say that it's one technology.

It's not, it's a lot of different technologies

that make up the end-to-end solution.

I believe that from a user standpoint they will start seeing the benefits of data that's refreshed and delivered to them quicker.

I see us adding machine learning into the pipeline

so that we can actually do predictive forecasting,

we could actually get ahead of the game.

Instead of looking at what has happened, we can look at what is going to happen, or how we can make it happen in the future.

So that's what I see as the future.

We'll address the scale, we'll address the

latency from end-to-end, we'll add the agility

and the componentization that we talked about earlier

which is how can I make MS Sales more agile

to Microsoft's business?

Then the combinability piece is more

how can I combine this normal relational financial data

with other big data elements, whether it be

Twitter feeds or market sentiment or whatever it is,

to actually provide bigger, better value for our customers

whether they be in marketing or

whether they be in finance, right, so that we can say

"Hey, we are going to sell X today,"

rather than, "What did we sell?"

Right now we get about 2.6 million transactions a day,

but we're architecting to do about

10 times that, about 26 million.

We have basically done the things that we

said we were going to do.

It gave us a spot where we could have the business

see the benefits and start thinking about

how their world will change as well as

have the team prove out the technology stack in between.

I believe we can impact not just Microsoft, but we can impact almost any financial institution that is risk-averse about moving their information to the cloud and taking advantage of these capabilities, because they know what worked in the past.

Maybe we can help show them what

it looks like in the future.

The power of the cloud is actually the ability

to not think about your infrastructure

and to light up new capabilities

from both a business standpoint as well as

from an IT standpoint.

It allows you to really focus your investment onto your core business value, not the maintaining of servers, to be honest.

That's what the cloud means to me.

It's expanding the capabilities of an organization.

In our 10-part video series

Expedition Cloud, Brad Wright and other

Microsoft technical experts will share the inside story

of Microsoft's journey to the cloud

including proven practices, things we'd do differently,

and advice for customers on the same journey

toward digital transformation.

For more information >> Rearchitecting for the cloud with Robert Venable - Duration: 13:23.

-------------------------------------------

Refactoring for the cloud with Darren Coil - Duration: 11:34.

This is Darren Coil.

He loves adventures of all

kinds.

His most prized possession is

his grandfather's rock

collection and his heroes are

his parents.

He's helping transform supply

chain business technologies at

Microsoft.

What I love about my job is the pace at which we are able to deploy really interesting manufacturing technologies to our own factory, a pace I have never seen outside of Microsoft in my 20-plus years in manufacturing technologies.

Microsoft, as many people are unaware, is a manufacturing company. We build Xbox, we build

Surface, we build Surface Hubs,

HoloLens, keyboards and

accessories.

The first big challenge was understanding the way the supply chain works at Microsoft. Everything from the sourcing, the plan, the make, the delivery, the care, the return, and the logistics. So just understanding the scope and the magnitude of our own supply chain.

What are we doing in our factories today? What opportunities are there to improve the way that we build a product? What do we know about the product? Where the product is, the quality of the product, the speed of deploying products?

There's lots of things in

manufacturing that are

important.

Manufacturing tends to be

fairly conservative in the way

they adopt technology.

In our case we have lots of

pockets of data.

So we had an engineering

database over here.

We had a SAP ERP system over

there.

We had the manufacturing execution system at the contract manufacturer's place. All these different locations, all these different types of data.

The first problem we're trying

to solve is how do we answer

questions about our business?

For example, if a device came

back and we wanted to know the

history of that device: when

was it made, where it was made,

where did it ship to, who

activated it, why did they

return it?

Just the one serial number would take us days to go to each one of those different data sources, pull the data, put it together in a report, and answer one business question.

If you want to keep up in this

market, you've got to answer

these questions faster. You can't spend your time getting data, bringing it to the forefront, answering one business question, and then going to figure out what the next business question is that you want to address.

That was the business problem

we were trying to solve.

Let's go connect all these data

sources in one location.

That way we can answer any question that we may have -- today it's a serial number, tomorrow it's a quality issue.

The day after that it's a

sourcing issue and we don't

have to go spend countless

hours curating, manicuring,

stitching together data.

A year ago we took our manufacturing operations on a digital transformation, and that transformation we broke down into three waves: a connected wave, a predictive wave, and a cognitive wave.

And the connected wave was really the first step, which is: there are lots of data sources all around the world; get connected to them all. We're just connecting to the data we already have. That turned into a trip to China.

We spent about a month there

and we connected to a dozen

different data sources.

They look at productivity they

look at yield.

They look at outputs.

They look at repairs and

inventory levels.

So productivity is data that

comes from our contract

manufacturer.

They use an Oracle-based ERP system.

And so what we did is we gave them a data contract, a fairly simple flat file format that says: here are the different fields that we need.

We helped them with a script to

extract the data out of their

system, and then we basically push that data to the event hub or to blob storage. So from the event hub or blob storage, that then moves into our Azure data lake.
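A data contract of that kind can be as simple as an agreed header row. A small check like the following (field names invented for illustration) is enough to tell a partner whether their extract script conforms before anything is pushed to the event hub or blob storage.

```python
# Hypothetical agreed field list; the actual contract fields are not public.
CONTRACT_FIELDS = ["serial_number", "build_date", "line_id", "yield_pct"]

def check_contract(header_line: str):
    """Compare a partner extract's header row against the data contract.
    Returns (missing, unexpected) so the partner can fix the extract script."""
    fields = [f.strip() for f in header_line.split(",")]
    missing = [f for f in CONTRACT_FIELDS if f not in fields]
    unexpected = [f for f in fields if f not in CONTRACT_FIELDS]
    return missing, unexpected

# A partner sends an extract missing one agreed field and adding a stray one.
missing, unexpected = check_contract("serial_number,build_date,line_id,operator")
```

Catching a schema drift at the contract boundary is far cheaper than discovering it downstream in the data lake.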

Our partner data flows to us across the Internet using encrypted packages into the event hub or into blob storage, depending on who's sending the data and what they're sending.

Our product data flows across a leased line, also encrypted, into our storage containers and then into our data lake.

For our network design, it comes down to the data that we're streaming off of the location: what does it mean? How important is it? What's its time sensitivity?

For example, we just connected process equipment at the factory. And so we're using an IoT-gateway-to-Azure connection to stream data live from the factory and then turn that data back around to make real-time decisions, so the round trip is seconds.

We just took the basic things that they trust and that they use every day, reports, and we automated those reports. We got them into Power BI, we've got the data moved into the cloud, we got some basic analytics behind it, and brought all that data back to the factory.

Our first deadline was to get

an operational control room for

the factory.

The four-person team went out there, and in six weeks we were able to automate their standard reports. They looked at everyday productivity, shipments, yield, quality. We built them a 10-screen Power BI visualization room where they could look at all the data live all the time.

We did all that in a six week

period before the CEO came out

to see an entire digital

transformation of the way that

that factory was looking at

data, and presenting data, and reviewing data.

We had all that insight and slicing available in Power BI immediately because we were connected to the raw data source.

We could answer questions about

what happened yesterday or the

week before. We can look at line-to-line comparisons. And all these things were instantly available and instantly answered with Power BI in the factory, which is what got the wheel started.

Since we began a year ago, we've connected to our primary factory, we've connected to a dozen of our vendors, we've connected to our delivery mechanisms, and we've connected to our customer service mechanisms.

We were able to do all this

over the last nine months.

That's in a connected phase.

The predictive stage is kind of where we are now, where the data lets us see trends as they are occurring in real time and respond to them. We can dispatch people to the floor.

We don't have to wait for an

excursion event to happen.

We can see supplier data coming in; we can make decisions about how much to build, where to build, where to ship, based on all this data coming to the surface in real time, because we don't have to collect the data anymore.

So for all of our automated test machinery, we did all of the statistical grading on the back end, so that when it came into Power BI it already had a sort of ranking as to whether or not this piece of data was important. That then goes into the heat maps, which allow us to find the data quickly, even on machine-level data like a lamination machine or a trimming machine.

All that data gets statistically graded actually in Power BI itself.

So we brought it all the way

through and then we developed

the statistical process control

rules in Power BI, and so it executes SPC on the fly.
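A classic SPC rule, flagging any point outside three standard deviations of an in-control baseline, is small enough to sketch directly. This is plain Python rather than the measures the team built in Power BI, and the yield readings are made up.

```python
import statistics

def spc_flags(values, baseline):
    """Flag points outside the 3-sigma control limits computed
    from a baseline (known in-control) sample."""
    mean = statistics.fmean(baseline)
    sigma = statistics.pstdev(baseline)
    ucl, lcl = mean + 3 * sigma, mean - 3 * sigma
    return [v < lcl or v > ucl for v in values]

baseline = [9, 10, 11, 10, 10]                # in-control readings (made up)
flags = spc_flags([10, 12, 8, 11], baseline)  # 12 and 8 breach the limits
```

Running a check like this on every incoming point is what lets the team dispatch someone to the floor before an excursion event, rather than after.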

So we have both.

The cognitive wave.

It's now there for us 24 by 7.

We've had dozens of conversations with these manufacturing operations that see the same thing. They laugh the same way; they say, oh yeah, we run our factory off of Excel and PowerPoint, and we have the same challenges. It takes us a week to answer one business question. And we show them: in five minutes we bring up Power BI and show our factory, and we're like, look, I can tell you why I missed production today.

Oh, I am short parts from this vendor, and I have a bunch of stuff stuck in repair. And we show them the power of the visualization layer and the pace at which you can answer questions, and then we go back and tell them: again, take advantage of all of the machine learning and the AI and the intelligent insights, and let that then drive your business.

The cognitive wave is where we

allow the machine learning to

solve complex problems for us

and we focus on manufacturing

operations and supply chain

operations.

The difference between the predictive wave and the cognitive wave is more around the fundamental technologies behind it. Cognitive is more about using machine learning and artificial intelligence.

So to get started we presented

the problem to several data

scientists.

We gave them all of the data streams. The streams came from fact information, such as process yield coming off of the machinery.

We gave them dimensional information: this component came from this physical location, the order in which things were put together, so that the pattern recognition could then look for relationships and causality and create a better, optimal solution for how we built the HoloLens.

It's large big data platform analysis to solve complex problems: not necessarily things that are occurring at this very second, but things that are occurring over longer periods of time. This sort of analysis by a human would take weeks and weeks and weeks of trial and error, lots of computation, Matlab and those kinds of computation programs, to come up with the same answer.

But by using machine learning

and doing the pattern

recognition we are able to come

up with the answer in just days.

For example, optimizing the maximum and minimum material conditions of a kitted device.

Fact information from what was happening, dimensional information in terms of where things are and what order things go together; it's like a fact sort of a construct.

And from there they then were

able to use the machine

learning pattern recognition to

come up with the optimal way to

assemble the HoloLens, which

improved our yields.

So these are the kinds of machine learning algorithm things that will let us get to cognitive at a faster pace in our manufacturing operations.

The big future for cognitive and AI is to tackle problems that haven't been addressed yet in manufacturing, simply because there was not enough data available, not enough computation, not enough pattern recognition to be able to do these things.

You can do other examples where you look at the way that you build a device, and you feed it a design of experiments, and it can generate better ways to assemble a device, or it will be able to predict what will happen in the future if you use component A or component B.

So far Microsoft has reaped a

number of important benefits

from the continued digital

transformation of its

manufacturing operation.

Darren said the team has

learned some important lessons

as well.

I think the thing we could have done differently: we should have kept the dedicated team dedicated, made this their full-time job. Of course we didn't know that it was going to have such an impact; we didn't know that the value was going to be there.

This digital transformation has been the largest change in manufacturing technologies in 30 years, and it was probably one of the easiest changes to bring to fruition. The actual physical part of connecting the data, bringing the data to Azure, and making the Power BI reports was actually very simple. Like I said, a couple of weeks to automate some of those basic reports and a few more weeks to connect to a bunch of machinery and bring all that data live.

The number one thing keeping all these companies from doing this, which was the same thing for us, is business adoption and change management. It's the fear of: if I go invest in trying to go on a digital transformation, will my business accept it? The answer to that is that it has to come from the top down.

When our vice president said we're going to do this, he was relentless for the two or three months that it took for everybody all the way down the organization to believe the data, to see the change, and to get on board.

Once you get that change management started, the flywheel begins, and then it perpetuates; it feeds itself.

The beauty of the architecture

is we've moved from systems of

records to systems of records

and a system of intelligence on

top of that. We're not changing

the fact that the data still

exists on the machine or the

data still exists in an ERP

system.

We are simply moving all of

that data up to a system of

intelligence.

We've seen value in

productivity gains,

people not collecting data

anymore.

We've averted product loss to the tune of millions and millions of dollars.

We've optimized operations in and around just the data that we're getting from the insights. The payback, the value of the implementation, is measured in days and weeks, not years.

In our 10-part video series Expedition Cloud, Microsoft technical experts will share the inside story of Microsoft's journey to the cloud, including proven practices, things we'd do differently, and advice for customers on the same journey toward digital transformation.

For more information >> Refactoring for the cloud with Darren Coil - Duration: 11:34.

-------------------------------------------

Q2 Weather: 10 p.m. with Bob McGuire for March 28, 2018 - Duration: 3:42.

For more information >> Q2 Weather: 10 p.m. with Bob McGuire for March 28, 2018 - Duration: 3:42.

-------------------------------------------

Get ready for a Day Hike - Duration: 0:31.

[Upbeat, energetic music starts, continues throughout] Day Hikes: Getting ready is easy and fun...

[Sound of waterproof jacket being thrown on and zipped up loudly]

[Sound of water filling up a drink bottle]

[Sound of sunscreen being squirted and slathered over face]

[Whooshing sound effect with camera movement and drums in music pick up pace]

[Whipping sound effect, then rustle of emergency blanket]

[Whooshing sound effects with zoom in, more rustling as blanket is folded]

[Whooshing sound effects as camera zooms in]

[Sound of laces being tied up]

[Whooshing sound effect with camera movement as items are packed]

[Sound of bag zipping and cord drawing bag closed]

[Triumphant, upbeat music ]

For what to wear and pack, go to www.doc.govt.nz/shortwalksgear

Department of Conservation Te Papa Atawhai

For more information >> Get ready for a Day Hike - Duration: 0:31.

-------------------------------------------

Vigil Held For Beaten LA Street Vendor, Highlighting Demand For Vending Legalization - Duration: 1:46.

For more information >> Vigil Held For Beaten LA Street Vendor, Highlighting Demand For Vending Legalization - Duration: 1:46.

-------------------------------------------

Get ready for a Short Walk - Duration: 0:36.

[Upbeat music starts, continues throughout] Short Walks: Getting ready is easy and fun...

[Sound of sunscreen being squirted and slathered over face]

[Sandwich being sliced with exaggerated sound effects]

[Sound of waterproof jacket being thrown on and zipped up loudly]

[Sound of water filling up a drink bottle]

[Whooshing sound effects with zoom in, sound of pencil on paper]

[Whipping sound effect, then rustling of jacket, then foot steps]

[Whooshing sound effect, then jacket is loudly unzipped]

[Whipping sound effect, then velcro shoes being done up]

[Whooshing sound effect with camera movement, then rustle of items being packed with added sound effects]

[Zipping of two bags]

[Triumphant, upbeat music ]

For what to wear and pack, go to www.doc.govt.nz/shortwalksgear

Department of Conservation Te Papa Atawhai
