Re: Automator-users Digest, Vol 63, Issue 5
- Subject: Re: Automator-users Digest, Vol 63, Issue 5
- From: Thierry Willemsens <email@hidden>
- Date: Thu, 21 Apr 2011 10:54:05 +0200
Hi Roy,
Thanks for your answer.
You are right: I can simply use that option, and I end up with a .txt file containing all the URLs.
That is only the first step of the problem, though, because I would then like to open each of these URLs and get the text of each page.
This is why I've made some changes to the workflow; using a variable could be the solution:
2) Get Link URLs from Webpages
3) Filter Paragraphs: return paragraphs that begin with http://www
4) New Text File
I've added the Filter Paragraphs action, but I don't know how to separate the individual URLs inside the text file.
When I use the Set Value of Variable action, I would like the variable to hold only one URL, not all of them.
If that is possible, I would then like to open each URL, copy the text of its page, and save it as a text file with fields separated by commas,
then append the result for the second URL to the same text file, and so on until the end.
Finally, with this text file I would be able to import the fields I need into FileMaker Pro.
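The per-URL loop described above can be sketched in Python as well. This is only an illustration under stated assumptions: the URLs and page contents below are invented stand-ins (in a real run, the HTML would come from fetching each URL, e.g. with urllib.request.urlopen), and the output file name pages.csv is hypothetical:

```python
import csv
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text of an HTML page, like 'Get Text from Webpage'."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())
    def text(self):
        return " ".join(self.chunks)

def page_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return parser.text()

# Invented stand-ins; a real run would fetch each URL instead, e.g. with
# urllib.request.urlopen(url).read().decode().
pages = {
    "http://example.com/a": "<html><body><h1>Name A</h1><p>Address A</p></body></html>",
    "http://example.com/b": "<html><body><h1>Name B</h1><p>Address B</p></body></html>",
}

# One comma-separated row per page, appended in order, ready for import
# into FileMaker Pro.
with open("pages.csv", "w", newline="") as out:
    writer = csv.writer(out)
    for url, html in pages.items():
        writer.writerow([url, page_text(html)])
```

FileMaker Pro can then import pages.csv directly as comma-separated values, mapping each column to a field of the empty database.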
The database is already done, but empty.
I hope everybody can understand this.
Thanks again for your help; I am still searching for a solution.
Regards,
Thierry
2011/4/21 Roy Whelden
<email@hidden>
Thierry,
If I understand this correctly, you could simply use "new text file" as step 3 below.
HTH,
Roy Whelden
On Apr 20, 2011, at 12:01 PM, email@hidden wrote:
Hello everybody,
I am trying to recover the text from links in a web page.
A picture says more than 1000 words, here is the page :
http://tinyurl.com/4yqon22
As you will see, I just need the links with the name and address in the
middle of the page.
Maybe it would be better to copy these links into a TextEdit document to
avoid the others?
I have done a small workflow.
1) Get Current Webpage from Safari
2) Get Link URLs from Webpages: no problem, I get a list as a result, but
with all the links from the page.
3) Get Text from Webpage: this is where my MBP got lost, because it kept
running with no result...
4) New text file
Is there a way to recover the information about these links and, for each of
them, the name (above, in red) and the different fields in the middle of
the page?
Sorry for my poor English; I speak French.
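(For reference, step 2 amounts to collecting every href on the page, which is why all the links come back. A rough Python equivalent, with invented sample HTML:)

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag, like 'Get Link URLs from Webpages'."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# Invented sample: one wanted link and one unwanted in-page anchor.
html = '<a href="http://www.example.com/entry1">Name 1</a> <a href="#top">top</a>'
collector = LinkCollector()
collector.feed(html)
print(collector.links)  # ['http://www.example.com/entry1', '#top']
```

Every link is collected indiscriminately, so a filtering step (e.g. keeping only paragraphs that begin with http://www) is needed afterwards.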
I thank you in advance for your answer.
Thierry
--
---
Think different