
Tag Archives: ISE

A couple of days ago, I talked about using the ISE command line to create script lines in the active script tab.

My process was:

  1. Switch to the email program,
  2. Copy the job name or full line out of the error email,
  3. Switch back to the PowerShell ISE, and
  4. Up-Arrow to create each new line

That was four steps – FOR EACH LINE.

That’s WAY too much like work. If only I could get the entire list at once…

Well, actually, I can.

The unique job name that I copied out of the error emails for each line is also the name used for the directory containing that job’s processed files on our processing server. I can build the query using the name of the directory, instead of copying each problem job name out of the error email.

But how do I know which directories contain jobs that generated an error email?

Actually, it’s better not to know. We’re better off running queries against ALL the jobs – that way, we can catch all the jobs that failed to get ingested into the full-text indexing/search app we use, even if the respective error email goes missing.

So to create a script that runs a query against every job processed in a day:

a. Start the PowerShell ISE – if the current script (file) tab isn’t empty, create a New one
b. In the ISE command line pane:
PS P:\> $PCE = $PSISE.CurrentFile.Editor
PS P:\> $solr_cl = 'jruby solring.rb Cust-37 PROD hits MEMBERS "20160916 '
PS P:\>   # $solr_cl is the command line for the bulk of the Solr query –
PS P:\>   # oh, and it’s not quite the query that I use – that’s proprietary info, of course
PS P:\> $PCE.Text += Get-ChildItem <path-to-data>\2016\201609\20160916 |
ForEach-Object { $solr_cl + $_.PSChildName.ToString() + '"' + "`n" }
PS P:\>

So instead of taking four steps for each line of the query script, I can create the entire query script using just three steps. PowerShell FTW!

But That’s Not All! You Also Get…

Sure, it’s great that I no longer have to copy lines out of individual emails to create a hits query script for all the jobs processed in a given day, but what about creating the addonly query to ingest the missing jobs into Solr?

Well, as it happened, our Solr services were down on 9/16, so all the jobs failed and needed to be added. And there were 121 jobs processed on 9/16, so the hits query was 121 lines long.

The MEMBERS parameter in the Solr command line for the hits query corresponds to a MEMBERS.TXT file that contains web menu lines used by our web app. Each line of an addonly query uses the same format as the lines in the MEMBERS.TXT file.

So, to create the addonly query, I opened a New (blank) script file tab in the ISE, then entered:

PS P:\> $PCE = $PSISE.CurrentFile.Editor   # re-bind $PCE to the new (now current) tab
PS P:\> $addonly_cl = 'jruby solring.rb Cust-37 PROD addonly MEMBERS '
PS P:\> Get-Content <path_to>\MEMBERS.TXT |
Select-Object -Last 121 |
ForEach-Object { $PCE.Text += $addonly_cl + $_ + "`n" }
PS P:\> # names have been changed to protect proprietary information

But Wait! There’s More!

So I’ve cut down the process of creating these Solr scripts from 4 steps per line to 2 or 3 steps for the whole script – but what about the output?

Previously, I was watching the results of each query, and logging each result in our trouble ticket system. I built in a 15-second delay (using Start-Sleep 15) for each line, and bounced back from our trouble ticket system (in a web browser on my PC’s desktop) to the processing server where I had to run the query.

Again, that’s WAY too much like work.

The results of each hits or addonly query are logged to individual text files in a directory on the processing server. This directory is not shared – however, the processing server can send email using our processing mail server.


  • I connected to the processing server (using Remote Desktop),
  • ran the hits query (to verify that all the jobs needed to be ingested using the addonly query)
  • ran the addonly query and watched as each job seemed to be added successfully, then
  • ran the hits query again (starting at 5:45 PM) to verify successful ingestion

I then used PowerShell to create and send an email report of the results:

PS C:\> $SMTPurl = "<URL of processing email server>"
PS C:\> $To = "<>", "<>"
PS C:\> $From = "<>"
PS C:\> $Subject = "Cust-37 hits for 20160916 <job#1>..<job#121> (after addonly)"
PS C:\> $Body = ""
PS C:\> $i = 0
PS C:\> Get-ChildItem <path-to-log-files>\solr_20160916*.log |
Where-Object { $_.LastWriteTime -gt (Get-Date "9/16/2016 5:45 PM") } |
Get-Content | ForEach-Object {
if ($_ -match "Getting") { $Body += ($i++).ToString() + ": " + $_ + "`n" }
if ($_ -match "Number of hits found") { $Body += $_ + "`n`n" }
}
PS C:\> Send-MailMessage -SmtpServer $SMTPurl `
-To $To -From $From -Subject $Subject -Body $Body

Shazam! I (and the other tech in this project) got a nice summary report, by the numbers.

What’s next?

Well, these were all commands entered at the PowerShell command line, either in the ISE (on my desktop) or in a regular PowerShell prompt (on the processing server). One obvious improvement would be to create a couple of script-building scripts (how meta!) that I (or someone else) could run to create the query scripts, and a separate script (to be run on the processing server) to generate the summary email.
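As a taste of what such a script-building script could look like, here's a sketch that just wraps the earlier one-liner in a parameterized .ps1 file – the script name is made up, and the data path and Solr command line are placeholders, as before:

# New-HitsQueryScript.ps1 (hypothetical name) – build a hits query in the current ISE tab
param([string]$DateDir)    # for example: 20160916
$solr_cl = 'jruby solring.rb Cust-37 PROD hits MEMBERS "' + $DateDir + ' '
Get-ChildItem <path-to-data>\$DateDir | ForEach-Object {
    $PSISE.CurrentFile.Editor.Text += $solr_cl + $_.PSChildName + '"' + "`n"
}

Run from the ISE command pane (where $PSISE is available), it would fill the current script tab:

PS P:\> .\New-HitsQueryScript.ps1 20160916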

What if (as is usually the case) only a few of the jobs need to have addonly queries created to be re-ingested? Well, the brute-force way would be to create the addonly query with all the jobs included, then manually edit it, deleting all the lines where the initial ingestion was a success.

But the slick way would be to scan the query results log files, get the job numbers of the jobs that failed to ingest, and pull only the corresponding lines from the MEMBERS.TXT file.

(Spoiler: one way would be to append the name of each failed job to a single alternation string, then get the contents of MEMBERS.TXT, extract the lines that -match that string, and use those lines to create the addonly query.

It might be faster, though, to load the lines of MEMBERS.TXT into a hashtable, with the job number as the key and the entire line as the value, then look up the line corresponding to each failed job.)
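A sketch of that hashtable idea, reusing the $PCE and $addonly_cl variables from above – and assuming (I'd verify this against the real file first) that the job number is the first whitespace-delimited field of each MEMBERS.TXT line, with $failed as a hypothetical list of failed job numbers:

PS P:\> $members = @{}
PS P:\> Get-Content <path_to>\MEMBERS.TXT |
ForEach-Object { $members[($_ -split '\s+')[0]] = $_ }
PS P:\> $failed | ForEach-Object {
$PCE.Text += $addonly_cl + $members[$_] + "`n" }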

Something that’s been bugging me for a while is that I couldn’t remember how to continue lines in the ISE command pane.

Of course, in a regular PowerShell window, you can just press Enter to continue entering a command on the next line:

PS C:\Users\Owner> 1..10 | Where-Object{ ($_ % 2) -eq 0 }
PS C:\Users\Owner> 1..10 | Where-Object{
>> ($_ % 2) -eq 0 }

You can press Enter after a "|" to start the next stage of a pipeline on the next line, after a "," when entering a list of items, after an opening brace "{" when starting a script block, and immediately after entering a "`" (a backtick) anywhere within a line (except inside a string literal).
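For example, the backtick continuation looks like this at a regular prompt:

PS C:\Users\Owner> Get-ChildItem -Path C:\Windows `
>> -Filter *.exe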

But if you try to use Enter in the command pane of the ISE, you’ll get an error. For example, when I try to break a line after the opening bracket of Where-Object (as in the example above), I get:

PS P:\> 1..10 | Where-Object {
Missing closing '}' in statement block.
    + CategoryInfo…

Very annoying.

I finally (after a couple of weeks of suffering along without continuation lines in my recent orgy of ISE work) tracked down the keystroke needed – it's Shift-Enter in the ISE.

Enter in the regular prompt, Shift-Enter in the ISE. Enter in the regular prompt, Shift-Enter in the ISE. Enter in the regular prompt, Shift-Enter in the ISE…

One of the things I do at work involves creating scripts to run a Ruby script. For each line in each script I create, I have to:

  1. Go to our secondary email program
  2. Copy a job name (a word, basically) or an entire line from an error email
  3. Change to an editing program (like the PowerShell Integrated Scripting Environment, a.k.a. ISE)
  4. Create a line with the Ruby command line and space for the text I’ve copied from the error email, and add the error email text to that line

Today, I had almost 120 lines to create this way – some of them in two versions.

Previously, I did it all by hand – I duplicated the Ruby script portion as many times as I needed, then copied and pasted the error text.

Today, though, contemplating the 200+ lines to create, I decided to dig a bit deeper into the PowerShell ISE.

I discovered I could open a PowerShell script file from disk using:

PS P:\> New-Item -ItemType File test0.ps1
PS P:\> $PSISE.CurrentPowerShellTab.Files.Add("P:\test0.PS1")

I then found I could access the open files in the tabbed script panes using standard array indexing notation:

PS P:\> $PSISE.CurrentPowerShellTab.Files[7]                # or [0], [1], [2], etc

With a little more experimentation, I found I could assign the tabbed script to a variable, Save its contents from the command line, and update the Text in the Editor property of the script pane:

PS P:\> $file0 = $PSISE.CurrentPowerShellTab.Files[7]
PS P:\> $file0.Editor.Text = "hello, world"
PS P:\> $file0.Editor.Text += "`n" + "Goodbye, cruel world"
PS P:\> $file0.Save()
PS P:\> Get-Content P:\test0.ps1
hello, world
Goodbye, cruel world

PS P:\>

I then discovered that I could access the current tab directly, without having to use the array indexing notation, or assign the tabbed script to a variable:

PS P:\> $PSISE.CurrentFile.Editor.Text = ""

I then adapted Recipe 8.3 (“Read and Write from the Windows Clipboard”) from the Windows PowerShell Cookbook to write a one-liner:

PS P:\> function Get-Clipboard { Add-Type -Assembly PresentationCore; [Windows.Clipboard]::GetText() }

Finally, I put the Ruby script lines into variables (for example, $ruby_script_1),
defined a new variable $PCE:

PS P:\> $PCE = $PSISE.CurrentFile.Editor

And used the results to add lines to the currently selected script tab:

PS P:\> $PCE.Text += $ruby_script_1 + (Get-Clipboard) + "`n"
# the `n is the PowerShell way to specify a newline character

Now, I just had to

  1. Switch to the email program,
  2. Copy the job name or full line out of the error email,
  3. Switch back to the PowerShell ISE, and
  4. Up-Arrow to create each new line

It looks like the same number of steps – but there are a lot fewer keypresses, so…WIN!!!