OS X Process-Fu 101

OS X is, of course, a variety of BSD Unix, not Linux. It's often the case that Linux commands either don't work on OS X, or don't give the same results they do on Linux. For example, a basic
netstat
will give you a long list of the processes making network connections on your box. On Linux, various command-line options can help you drill down from there to answer more specific queries. On OS X, those same options don't give you the answers you're seeking. This isn't a compare-and-contrast article between Linux and OS X. Instead, I'm just noting various commands that can answer some basic questions. On OS X, for example, to answer the question "What process is running on port 50224?" we would use:
sudo lsof -Pn | grep 50224
This will give you a list of everything connected to port 50224, sans any kind of headings. A much more abbreviated command that will give you essentially the same thing, but with column headings as well, is:
sudo lsof -i :50224
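If all you want is the PID itself (to feed into another command, say), lsof's -t flag prints terse output, just the PIDs and nothing else:
sudo lsof -t -i :50224
The fuller listings above give you more context; this one is handy for scripting.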

Now, suppose we wanted a listing of all processes making network connections from our (OS X) box, including their Command Name, PID, Type of connection, and the Port they're listening on (along with other info)?
lsof -i -P | less
This command gives all this, with a heading at the top. Piping to less is optional.
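If you only care about things actually listening for connections, rather than established ones, grepping the output is good enough for a quick look (a rough filter, not an exhaustive one):
sudo lsof -i -P | grep -i listen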
Another thing that often comes up is finding out whether a certain process is running; let's call it foo:
ps aux | grep foo
This will show whether or not foo is indeed running, and if it is, what its PID is. From there, if foo is unwanted, it can be killed by that PID:
kill -9 <PID>
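As an aside, pgrep and pkill (both present on OS X) can shortcut the grep-then-kill dance, assuming the process really is named foo:
pgrep -l foo
pkill -9 foo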
Just as a note: in some cases, you may need to
sudo
the above commands.
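Putting a couple of these together: to kill whatever is squatting on port 50224 in one shot, something like the following should do it (a sketch that assumes a single process owns the port):
sudo kill -9 $(sudo lsof -t -i :50224)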
