nrichards

Members
  • Content count: 10
  • Days Won: 1

Community Reputation: 1 Neutral

About nrichards

  • Rank: Community Whiz Kid
  1. Would it be possible to provide an API call (or calls) that returns a 'hit count' (historical and current) against alert rules and escalation chains? Ideally it would allow filtering by the alert levels of interest. This would help provide metrics around how many alerts are being generated and to what areas of responsibility, and help drive additional questions around configuration and maintenance. I know there is a report to extract thresholds and their destinations, but these metrics do not appear to be available currently. Many Thanks ~Nick
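    To illustrate the sort of numbers I'm after, here is a rough workaround sketch against the existing /alert/alerts resource, reusing the LMv1 signing pattern from my datasource-import post further down this list. The v1-style response wrapper and the 'rule' property I group on are assumptions on my part and may need adjusting:

    # Rough sketch (workaround, not the requested feature): tally recent alerts per alert rule
    $accessId  = '<accessId>'
    $accessKey = '<accessKey>'
    $company   = '<company>'

    $httpVerb     = 'GET'
    $resourcePath = '/alert/alerts'
    $queryParams  = '?size=1000'   # add a filter for the alert levels of interest here
    $url   = 'https://' + $company + '.logicmonitor.com/santaba/rest' + $resourcePath + $queryParams
    $epoch = [Math]::Round((New-TimeSpan -Start (Get-Date -Date '1/1/1970') -End (Get-Date).ToUniversalTime()).TotalMilliseconds)

    # LMv1 signature: verb + epoch + body + resourcePath (no body on a GET)
    $requestVars = $httpVerb + $epoch + $resourcePath
    $hmac = New-Object System.Security.Cryptography.HMACSHA256
    $hmac.Key = [Text.Encoding]::UTF8.GetBytes($accessKey)
    $signatureBytes = $hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($requestVars))
    $signatureHex = [System.BitConverter]::ToString($signatureBytes) -replace '-'
    $signature = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($signatureHex.ToLower()))
    $headers = @{ 'Authorization' = 'LMv1 ' + $accessId + ':' + $signature + ':' + $epoch }

    # 'data.items' and the 'rule' property are assumed response field names
    $alerts = (Invoke-RestMethod -Uri $url -Method Get -Headers $headers).data.items
    $alerts | Group-Object -Property rule | Sort-Object Count -Descending | Select-Object Name, Count
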
  2. Whilst LogicMonitor supports two-factor authentication (https://www.logicmonitor.com/support/settings/users-and-roles/two-factor-authentication/), beyond the phone call and SMS options it appears to be limited to Authy as a provider. Although they're a perfectly good vendor, it would be useful if it were possible to configure Multi-Factor Authentication across a variety of providers. Of most interest to me right now is Azure MFA. Is this something that is available already (though undocumented)? If not, is this something that could be factored into the release cycle? Many Thanks ~Nick
  3. Alerts if there are X failures over Y time

    I second this request. The ability to incorporate a time-based / duration-based metric for datasources such as CPU / memory usage (especially) would be really useful. We would like to be able to implement this for scenarios such as: if Device A breaches the configured CPU threshold for more than 1 hour, generate an alert; if it breaches for less than an hour, do nothing. We have different application teams that would want the duration to be customisable, in line with their applications' behaviour. Is this something that is in the development team's backlog?
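    To show the behaviour we're after in pseudo-logic (this is purely illustrative, not an existing LogicMonitor setting as far as I know): with a 5-minute polling interval, "breached for more than 1 hour" is effectively 12 consecutive samples over the threshold.

    # Illustrative only: raise an alert after N consecutive samples over a threshold
    $threshold        = 90     # e.g. CPU %
    $pollIntervalMins = 5
    $requiredMinutes  = 60
    $requiredSamples  = [Math]::Ceiling($requiredMinutes / $pollIntervalMins)   # = 12

    $consecutiveBreaches = 0
    foreach ($sample in $cpuSamples) {   # $cpuSamples is a hypothetical series of readings
        if ($sample -gt $threshold) { $consecutiveBreaches++ } else { $consecutiveBreaches = 0 }
        if ($consecutiveBreaches -ge $requiredSamples) {
            Write-Output "Alert: CPU above $threshold% for $requiredMinutes minutes or more"
            break
        }
    }
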
  4. Programmatically generating datasources via API

    Hi Sarah, we have a number of systems that have third-party components installed to add additional functionality into services like Lync and Exchange. These applications have their own performance metrics. As we're moving to LogicMonitor as our primary monitoring solution, it's important to have the same (or as near as possible) coverage within the estate. I had hoped to be able to compile the classes and counters required outside of LM (CSV, JSON), and then construct the datasources via a loop, rather than crafting one at a time. Many Thanks ~Nick
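    For illustration, this is the sort of loop I had in mind, against a made-up CSV layout of classes/counters. The XML fields below are placeholders only; the real payload would need to mirror an exported LogicMonitor datasource definition:

    # Hypothetical sketch: build one datasource XML file per row of a counter CSV
    # counters.csv columns (my own made-up layout): Name, WmiClass, Counter, Description
    $counters = Import-Csv -Path 'c:\repositories\LogicModules\counters.csv'

    foreach ($counter in $counters) {
        # Placeholder structure - copy the real layout from an exported datasource
        $xml = @(
            '<datasource>'
            "  <name>$($counter.Name)</name>"
            "  <description>$($counter.Description)</description>"
            "  <wmiclass>$($counter.WmiClass)</wmiclass>"
            "  <counter>$($counter.Counter)</counter>"
            '</datasource>'
        ) -join "`n"

        $outFile = Join-Path 'c:\repositories\LogicModules\DataSources' ("$($counter.Name).xml")
        $xml | Set-Content -Path $outFile -Encoding UTF8

        # Each file could then be pushed with the Import-LMDatasource function
        # from my other post on importing datasources via the API:
        # Import-LMDatasource -Credential $credentials -FilePath $outFile
    }
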
  5. Is there any particular reason why there is no documented process for programmatically generating datasources via the REST API? https://www.logicmonitor.com/support/rest-api-developers-guide/datasources/ Is it in the pipeline at all? Many Thanks, ~Nick
  6. Import Datasource via API and PowerShell

    Hi Sarah, Thanks for this. My import is now working following the example you provided, though I am getting some formatting issues in the Collector Attributes > Groovy Script section. I don't know if it's actually an issue yet, but imagine that it might be. I understand your comment completely; we're looking to commit our internally amended / created datasources into version control, and import them into our production instance under change, hence the desire for this. Many Thanks, ~Nick
  7. Hello All, I'm attempting to import datasources into our LogicMonitor instance using the API and PowerShell. The following documentation only provides a cURL example, which isn't really sufficient for us: https://www.logicmonitor.com/support/rest-api-developers-guide/datasources/import-datasources-from-xml/

    Usage: Import-LMDatasource -Credential $credentials -FilePath 'c:\repositories\LogicModules\DataSources\CustomDataSource.xml'

    I am assuming that you have an array containing your accessId, accessKey and company name prior to calling the function. The FilePath parameter is the full path to the XML file to be uploaded.

    function Import-LMDatasource {
        [CmdletBinding()]
        param (
            [Parameter(Mandatory = $false)]
            [array]$Credential,

            [Parameter(Mandatory = $false)]
            [ValidateNotNullOrEmpty()]
            [string]$FilePath
        )

        Begin {
            # Prompt for credentials if none were supplied
            if (-not ($Credential)) {
                $accessId  = Read-Host -Prompt "Please supply accessId:"
                $accessKey = Read-Host -Prompt "Please supply accessKey:"
                $company   = Read-Host -Prompt "Please supply company:"
            }
            else {
                $accessId  = $Credential.accessId
                $accessKey = $Credential.accessKey
                $company   = $Credential.company
            }

            # Request details
            $httpVerb     = 'POST'
            $resourcePath = '/setting/datasources'
            $queryParams  = '/importxml'
            $data         = ''
            $url          = 'https://' + $company + '.logicmonitor.com/santaba/rest' + $resourcePath + $queryParams
            $epoch        = [Math]::Round((New-TimeSpan -Start (Get-Date -Date '1/1/1970') -End (Get-Date).ToUniversalTime()).TotalMilliseconds)

            # System.Web is needed for the MIME type lookup
            Add-Type -AssemblyName System.Web
            $contentType = [System.Web.MimeMapping]::GetMimeMapping($FilePath)
        }

        Process {
            # Build the LMv1 signature
            $requestVars = $httpVerb + $epoch + $data + $resourcePath
            $hmac = New-Object System.Security.Cryptography.HMACSHA256
            $hmac.Key = [Text.Encoding]::UTF8.GetBytes($accessKey)
            $signatureBytes = $hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($requestVars))
            $signatureHex = [System.BitConverter]::ToString($signatureBytes) -replace '-'
            $signature = [System.Convert]::ToBase64String([System.Text.Encoding]::UTF8.GetBytes($signatureHex.ToLower()))
            $auth = 'LMv1 ' + $accessId + ':' + $signature + ':' + $epoch

            # Build the multipart/form-data request containing the XML file
            Add-Type -AssemblyName System.Net.Http
            $httpClientHandler = New-Object System.Net.Http.HttpClientHandler
            $httpClient = New-Object System.Net.Http.HttpClient $httpClientHandler
            $httpClient.DefaultRequestHeaders.Authorization = $auth

            $packageFileStream = New-Object System.IO.FileStream @($FilePath, [System.IO.FileMode]::Open)
            $contentDispositionHeaderValue = New-Object System.Net.Http.Headers.ContentDispositionHeaderValue 'form-data'
            $contentDispositionHeaderValue.Name = 'file'
            $contentDispositionHeaderValue.FileName = (Split-Path -Path $FilePath -Leaf)

            $streamContent = New-Object System.Net.Http.StreamContent $packageFileStream
            $streamContent.Headers.ContentDisposition = $contentDispositionHeaderValue
            $streamContent.Headers.ContentType = New-Object System.Net.Http.Headers.MediaTypeHeaderValue $contentType

            $content = New-Object System.Net.Http.MultipartFormDataContent
            $content.Add($streamContent)

            # Send the request and surface any failure details
            $response = $httpClient.PostAsync($url, $content).Result
            if (!$response.IsSuccessStatusCode) {
                $responseBody = $response.Content.ReadAsStringAsync().Result
                $errorMessage = "Status code {0}. Reason {1}. Server reported the following message: {2}." -f $response.StatusCode, $response.ReasonPhrase, $responseBody
                throw [System.Net.Http.HttpRequestException] $errorMessage
            }

            return $response.Content.ReadAsStringAsync().Result
            # $httpClient.Dispose()
            # $response.Dispose()
        }

        End { }
    }

    Result:

    {"errmsg":"Request content is too large, max allowed size is 10240","status":1007}

    Dot sourcing the function at runtime allows me to inspect the variables set during execution:

    $response

    Version             : 1.1
    Content             : System.Net.Http.StreamContent
    StatusCode          : OK
    ReasonPhrase        : OK
    Headers             : {[Date, System.String[]], [Server, System.String[]]}
    RequestMessage      : Method: POST, RequestUri: 'https://<instance>.logicmonitor.com/santaba/rest/setting/datasources/importxml', Version: 1.1, Content: System.Net.Http.MultipartFormDataContent, Headers: {
                            Authorization: LMv1 <ACCESSTOKEN-OBFUSCATED>
                            Content-Type: multipart/form-data; boundary="17ba48f6-b5e9-48c6-9002-7544d99025d8"
                            Content-Length: 11858
                          }
    IsSuccessStatusCode : True

    Uploading the exact same datasource XML via the GUI works. I think I'm most of the way there, but obviously something I'm doing is bloating the request size and tripping this limit. Has anyone had any success with uploading datasources via the API? Many Thanks, ~Nick
  8. Effective Disk Monitoring

    Is it possible to create a disk threshold that aggregates multiple conditions before sending an alert? The scenario I have in mind is where a disk might be 90% full, which would ordinarily trigger an alert, but due to the size of the disk (say 1TB) there is still 100GB remaining. I would like to be able to build a threshold that checks whether a disk is over 90% full and has less than 5GB of free space remaining, and only then alerts. Has anyone had any success in doing something similar? Many Thanks, ~Nick
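    To show the combined condition I mean, here is a quick stand-alone check against WMI. This is just the logic, not a LogicMonitor threshold definition:

    # Illustrative only: flag volumes that are both over 90% used AND under 5 GB free
    $percentThreshold = 90
    $freeGbThreshold  = 5

    Get-CimInstance -ClassName Win32_LogicalDisk -Filter 'DriveType=3' | ForEach-Object {
        $percentUsed = (($_.Size - $_.FreeSpace) / $_.Size) * 100
        $freeGb      = $_.FreeSpace / 1GB
        if (($percentUsed -gt $percentThreshold) -and ($freeGb -lt $freeGbThreshold)) {
            Write-Output ("{0} is {1:N1}% full with only {2:N1} GB free - would alert" -f $_.DeviceID, $percentUsed, $freeGb)
        }
    }
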
  9. Twitter List Integration

    Hi All, I was recently asked whether or not it was possible to integrate/embed a Twitter list or timeline into a dashboard view. My immediate thought is that it should be possible using the HTML widget, but has anyone out there had any success doing so? And if you have, are there any pitfalls/gotchas to avoid? Many Thanks, ~Nick
  10. Nested/Cascading Dashboards

    I've been looking at how best to represent the status of a service / application using dashboards within LogicMonitor. It would be really useful if we could nest/cascade dashboards, so that you could represent items at a very high level (e.g. overall status of your datacentre infrastructure) and then drill down through underlying dashboards, etc. I've found that a similar functionality request was submitted by another member, but that was back in 2014.